Recommendation system for restaurants

Based on the Yelp Dataset.

0. Libraries

First, we import all the libraries we need.

In [1]:
from matplotlib.ticker import PercentFormatter as _PercentFormatter
import matplotlib.pyplot as _plt
import numpy as _np
import pandas as _pd
import joblib as _jl
import glob as _glob
import os as _os
import re as _re
import time as _time
from multiprocessing import Pool as _Pool
from sklearn.preprocessing import OrdinalEncoder as _OrdinalEncoder, binarize as _binarize
from sklearn.metrics import confusion_matrix as _confusion_matrix, roc_curve as _roc_curve, classification_report as _classification_report, accuracy_score as _accuracy_score
from sklearn.model_selection import GridSearchCV as _GridSearchCV
from sklearn.svm import LinearSVC as _LinearSVC
from sklearn.ensemble import RandomForestClassifier as _RandomForestClassifier
from keras.models import Sequential as _Sequential, load_model as _load_model
from keras.layers import Dense as _Dense

_pd.set_option('display.max_columns', None)
Using TensorFlow backend.

Since we are going to work with large datasets that will need to be loaded several times, we define a convenience function that deletes all user-defined variables, in order to free memory.

In [2]:
def _del_all():
    # IPython magic: drop every variable whose name does not start with an
    # underscore (this is why the imports above are all underscore-aliased).
    %reset_selective -f [^_]

1. Data cleaning

Based on the kernel by Hou, Saha, Tsang.

We execute the code in recommendation_system_preprocessing.ipynb to clean the data and reduce the size of the dataset, using pickles instead of JSON and dropping unnecessary columns.

We explore the resulting datasets:

In [5]:
# Note: [!checked] is a character class, so this skips any file whose first
# letter is one of c, h, e, k, d — which here excludes checked_review.pickle.
dataset_list = _glob.glob("../dataset/[!checked]*.pickle")
for d in dataset_list:
    dataset = _pd.read_pickle(d)
    
    f = _os.path.splitext(_os.path.basename(d))[0]
    c = ", ".join(list(dataset.columns))
    s = dataset.shape
    
    print("Dataset '" + f + "':")
    print("\tfeatures:", c)
    print("\tshape:", s)
    print()
Dataset 'all_checkin':
	features: business_id, date
	shape: (57402, 2)

Dataset 'all_review':
	features: review_id, user_id, business_id, stars, useful, funny, cool, text, date
	shape: (4201684, 9)

Dataset 'all_tips':
	features: restaurant_name, tips_date, user_id
	shape: (770878, 3)

Dataset 'all_users':
	features: user_id, user_name, average_stars, yelping_since, review, years_of_elite, fans, useful, cool, funny, friends
	shape: (1148098, 11)

Dataset 'restaurants':
	features: name, business_id, address, cuisine, postal_code, latitude, longitude, review_count, stars, OutdoorSeating, BusinessAcceptsCreditCards, RestaurantsDelivery, RestaurantsReservations, WiFi, Alcohol, categories, city, Monday_Open, Tuesday_Open, Wednesday_Open, Thursday_Open, Friday_Open, Saturday_Open, Sunday_Open, Monday_Close, Tuesday_Close, Wednesday_Close, Thursday_Close, Friday_Close, Saturday_Close, Sunday_Close
	shape: (59371, 31)

In [10]:
_del_all()

2. Fake Review Detection

Based on Zhiwei Zhang's work and code.

Then, to filter out deceptive reviews that could skew the results of our analysis, we load the Support Vector Machine model defined in Yelp_sentiment_analysis/Scripts/fake_reviews.ipynb by Zhiwei Zhang, which has the best accuracy, precision, recall, and F1-score.

In [2]:
vectorizer = _jl.load('../models/tfidf_vectorizer.joblib')
svc = _jl.load('../models/fake_review_svc_model.joblib')

Now, we can apply this model to our data.

In [4]:
review = _pd.read_pickle("../dataset/all_review.pickle")

review.head()
Out[4]:
review_id user_id business_id stars useful funny cool text date
3 yi0R0Ugj_xUx_Nek0-_Qig dacAIZ6fTM6mqwW5uxkskg ikCg8xy5JIg_NGPx-MSIDA 5 0 0 0 Went in for a lunch. Steak sandwich was delici... 2018-01-09 20:56:38
5 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 4 0 0 0 I'll be the first to admit that I was not exci... 2013-01-20 13:25:59
6 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 3 5 4 5 Tracy dessert had a big name in Hong Kong and ... 2016-05-07 01:21:02
7 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 3 1 1 This place has gone down hill. Clearly they h... 2010-10-05 19:12:35
10 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 4 0 0 0 Like walking back in time, every Saturday morn... 2011-11-30 02:11:15
In [5]:
texts = list(review["text"])
X = vectorizer.transform(texts)
predictions = svc.predict(X)
In [6]:
print(type(predictions))
print("SVC predictions:", predictions)
<class 'numpy.ndarray'>
SVC predictions: [1 1 1 ... 1 1 1]

Now we repeat the whole process with a calibrated version of the model, which yields class probabilities instead of a binary label.
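As background: LinearSVC has no predict_proba, so a probability-capable variant is typically obtained by wrapping it in scikit-learn's CalibratedClassifierCV. A minimal sketch on toy data (the actual calibrated model was trained in Zhiwei Zhang's notebook; the data and names here are purely illustrative):

```python
# Illustrative only: shows how an SVC without predict_proba can be wrapped
# to yield class probabilities, as in the calibrated model loaded below.
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.svm import LinearSVC

rng = np.random.RandomState(0)
X_toy = rng.rand(100, 5)                 # toy features
y_toy = (X_toy[:, 0] > 0.5).astype(int)  # toy binary labels

cal = CalibratedClassifierCV(LinearSVC(max_iter=10000), cv=3)
cal.fit(X_toy, y_toy)

proba = cal.predict_proba(X_toy)         # shape (n_samples, 2); rows sum to 1
print(proba.shape)
```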

In [7]:
cal_svc = _jl.load('../models/fake_review_cal_svc_model.joblib')
cal_predictions = cal_svc.predict_proba(X)
In [8]:
print("Calibrated SVC predictions:\n", cal_predictions)
cal_predictions = cal_predictions[:, 1]  # keep the probability of class '1' (genuine)
print("Calibrated SVC predictions for class '1':\n", cal_predictions)
Calibrated SVC predictions:
 [[0.12112422 0.87887578]
 [0.011      0.989     ]
 [0.03538102 0.96461898]
 ...
 [0.36147536 0.63852464]
 [0.00379598 0.99620402]
 [0.00133685 0.99866315]]
Calibrated SVC predictions for class '1':
 [0.87887578 0.989      0.96461898 ... 0.63852464 0.99620402 0.99866315]
In [9]:
print("columns before:\n", review.columns)
checked_review = review.assign(bin_truth_score=predictions, real_truth_score=cal_predictions)
print("columns after:\n", checked_review.columns)
columns before:
 Index(['review_id', 'user_id', 'business_id', 'stars', 'useful', 'funny',
       'cool', 'text', 'date'],
      dtype='object')
columns after:
 Index(['review_id', 'user_id', 'business_id', 'stars', 'useful', 'funny',
       'cool', 'text', 'date', 'bin_truth_score', 'real_truth_score'],
      dtype='object')

Let's see what we just obtained.

In [10]:
checked_review[['review_id', 'text', 'bin_truth_score', 'real_truth_score']].head()
Out[10]:
review_id text bin_truth_score real_truth_score
3 yi0R0Ugj_xUx_Nek0-_Qig Went in for a lunch. Steak sandwich was delici... 1 0.878876
5 fdiNeiN_hoCxCMy2wTRW9g I'll be the first to admit that I was not exci... 1 0.989000
6 G7XHMxG0bx9oBJNECG4IFg Tracy dessert had a big name in Hong Kong and ... 1 0.964619
7 8e9HxxLjjqc9ez5ezzN7iQ This place has gone down hill. Clearly they h... 1 0.867897
10 kbtscdyz6lvrtGjD1quQTg Like walking back in time, every Saturday morn... 1 0.975404
In [11]:
data = checked_review['bin_truth_score']
_plt.hist(data, weights=_np.ones(len(data)) / len(data))
_plt.title("SVC labels distribution")
_plt.gca().yaxis.set_major_formatter(_PercentFormatter(1))
_plt.show()
In [12]:
data = checked_review['real_truth_score']
_plt.hist(data, weights=_np.ones(len(data)) / len(data))
_plt.title("Calibrated SVC labels distribution")
_plt.gca().yaxis.set_major_formatter(_PercentFormatter(1))
_plt.show()

Finally, we can save the new dataset without the text column, in order to save space and computation time.

In [17]:
checked_review.drop(columns=['text'], inplace=True)
checked_review.to_pickle('../dataset/checked_review.pickle')

We check that everything worked properly.

In [18]:
final_review = _pd.read_pickle('../dataset/checked_review.pickle')
print(final_review.columns)
final_review.head()
Index(['review_id', 'user_id', 'business_id', 'stars', 'useful', 'funny',
       'cool', 'date', 'bin_truth_score', 'real_truth_score'],
      dtype='object')
Out[18]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
3 yi0R0Ugj_xUx_Nek0-_Qig dacAIZ6fTM6mqwW5uxkskg ikCg8xy5JIg_NGPx-MSIDA 5 0 0 0 2018-01-09 20:56:38 1 0.878876
5 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 4 0 0 0 2013-01-20 13:25:59 1 0.989000
6 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 3 5 4 5 2016-05-07 01:21:02 1 0.964619
7 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 3 1 1 2010-10-05 19:12:35 1 0.867897
10 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 4 0 0 0 2011-11-30 02:11:15 1 0.975404
In [ ]:
_del_all()

3. Historical features

Following this paper, we add some historical features to our dataset:

  1. user-level features:
    1.1. average of the ratings given by a certain user,
    1.2. number of reviews written by a certain user,
  2. business-level features:
    2.1. average of the ratings given to a certain restaurant,
    2.2. number of reviews written about a certain restaurant,
  3. user-business features:
    3.1. average rating given by a certain user to each category,
    3.2. average of the ratings given by a certain user to the categories of a certain restaurant.

Before proceeding with the computation of the new features, we have to split the dataset into three parts:

  1. Test set: from the last day in the dataset back M months;
  2. Training set: from the day before the test set begins, back N months;
  3. History: the remaining part of the dataset, used to compute the historical features.

We pick M=3 and N=8, so the test set runs from 9/1/2018 to 11/30/2018, the training set from 1/1/2018 to 8/31/2018, and the history contains the remaining data, from 10/12/2004 to 12/31/2017.
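The split itself is plain boolean masking on the date column; a toy check of the boundary conditions (the dates here are made up to sit exactly on the cut points):

```python
import pandas as pd

# One date per region of interest: history, train start/end, test start/end.
reviews_toy = pd.DataFrame({
    'date': pd.to_datetime(
        ['2017-12-31', '2018-01-01', '2018-08-31', '2018-09-01', '2018-11-30'])})

test  = reviews_toy[reviews_toy['date'] >= '2018-09-01']
train = reviews_toy[(reviews_toy['date'] >= '2018-01-01')
                    & (reviews_toy['date'] < '2018-09-01')]
hist  = reviews_toy[reviews_toy['date'] < '2018-01-01']

print(len(hist), len(train), len(test))   # 1 2 2 — no overlap, no gaps
```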

In [5]:
review_all = _pd.read_pickle("../dataset/checked_review.pickle")
review_test = review_all[review_all['date']>=_np.datetime64('2018-09-01')]
review_train = review_all[(review_all['date']>=_np.datetime64('2018-01-01')) & (review_all['date']<_np.datetime64('2018-09-01'))]
# review_hist = review_all[review_all['date']<_np.datetime64('2018-01-01')]

review_test.to_pickle('../dataset/m2_n9/review_test.pickle')
review_train.to_pickle('../dataset/m2_n9/review_train.pickle')
# review_hist.to_pickle('../dataset/m2_n9/review_hist.pickle')
In [13]:
tips_all = _pd.read_pickle("../dataset/all_tips.pickle")
tips_test = tips_all[tips_all['tips_date']>=_np.datetime64('2018-10-01')]
tips_train = tips_all[(tips_all['tips_date']>=_np.datetime64('2018-01-01')) & (tips_all['tips_date']<_np.datetime64('2018-10-01'))]
tips_hist = tips_all[tips_all['tips_date']<_np.datetime64('2018-01-01')]

tips_test.to_pickle('../dataset/m2_n9/tips_test.pickle')
tips_train.to_pickle('../dataset/m2_n9/tips_train.pickle')
tips_hist.to_pickle('../dataset/m2_n9/tips_hist.pickle')
In [18]:
_del_all()

User-level features

In [3]:
review_hist = _pd.read_pickle('../dataset/m2_n9/review_hist.pickle')
users = _pd.read_pickle("../dataset/all_users.pickle")
In [8]:
avg_stars = review_hist['stars'].mean()

users = users.assign(average_stars=avg_stars)
users = users.assign(num_reviews=0)
users = users.assign(average_stars_bin=avg_stars)
users = users.assign(num_reviews_bin=0)
users = users.assign(average_stars_real=avg_stars)
users = users.assign(num_reviews_real=0)
users = users.set_index('user_id')
users.head()
Out[8]:
user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
user_id
l6BmjZMeQD3rDxWUbiAiow Rashmi 3.703313 2013-10-08 95 3 5 84 25 17 2374 0 3.703313 0 3.703313 0
4XChL029mKr5hydo79Ljxg Jenna 3.703313 2013-02-21 33 0 4 48 16 22 27646 0 3.703313 0 3.703313 0
bc8C_eETBWL0olvFSJJd0w David 3.703313 2013-10-04 16 0 0 28 10 8 358 0 3.703313 0 3.703313 0
dD0gZpBctWGdWo9WlGuhlA Angela 3.703313 2014-05-22 17 0 5 30 14 4 12598 0 3.703313 0 3.703313 0
MM4RJAeH6yuaN8oZDSt0RA Nancy 3.703313 2013-10-23 361 4 39 1114 665 279 5542 0 3.703313 0 3.703313 0
In [3]:
def _f(grouped):
    d = {}
    
    # Plain count and average over all reviews in the group.
    d['num'] = grouped['stars'].size
    d['stars'] = grouped['stars'].mean()
    
    # Binary filter: keep only the reviews the SVC labelled as genuine (1).
    non_fake = _np.ma.masked_where(grouped['bin_truth_score'] <= 0, grouped['stars']).compressed()
    d['num_bin'] = non_fake.size
    d['stars_bin'] = non_fake.mean()
    
    # Soft filter: weight each review by its calibrated probability of being genuine.
    d['num_real'] = grouped['real_truth_score'].sum()
    d['stars_real'] = _np.average(grouped['stars'], weights=grouped['real_truth_score'])
    
    return _pd.Series(d, index=['num', 'stars', 'num_bin', 'stars_bin', 'num_real', 'stars_real'])
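To make the three variants concrete, here is the same aggregation done by hand on a tiny made-up group of three reviews (all values are illustrative):

```python
# Toy illustration of the per-user aggregation: plain, binary-filtered,
# and probability-weighted review counts and star averages.
import numpy as np
import pandas as pd

group = pd.DataFrame({
    'stars':            [5, 1, 4],
    'bin_truth_score':  [1, 0, 1],        # SVC label: 1 = genuine
    'real_truth_score': [0.9, 0.2, 0.8],  # calibrated probability of genuine
})

num        = group['stars'].size                       # 3 reviews
stars      = group['stars'].mean()                     # (5+1+4)/3
non_fake   = group.loc[group['bin_truth_score'] > 0, 'stars']
num_bin    = non_fake.size                             # 2 genuine reviews
stars_bin  = non_fake.mean()                           # (5+4)/2 = 4.5
num_real   = group['real_truth_score'].sum()           # 0.9+0.2+0.8 = 1.9
stars_real = np.average(group['stars'], weights=group['real_truth_score'])

print(num, num_bin, stars_bin, round(stars_real, 3))   # 3 2 4.5 4.158
```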
In [ ]:
grouped_reviews = review_hist.groupby('user_id').apply(_f)
grouped_reviews.head()
In [18]:
import random
import statistics

current_milli_time = lambda: int(round(_time.time() * 1000))

def get_time(df):
    us_id = random.choice(grouped_reviews.index)
    x = random.randrange(1000)
    t = current_milli_time()
    df.loc[us_id, ["test"]] = x
    t0 = current_milli_time()
    return t0-t

def get_time_mul(df):
    us_id = random.choice(grouped_reviews.index)
    x = random.randrange(1000)
    y = random.randrange(1000)
    z = random.randrange(1000)
    t = current_milli_time()
    df.loc[us_id, ["test", "ciao", "prova"]] = [x, y, z]
    t0 = current_milli_time()
    return t0-t

def test():
    df = users.copy()
    df['test'] = -1
    times = []
    for i in range(1000):
         times += [get_time(df)]
    avg_time = statistics.mean(times)
    del df
    return avg_time

def test_mul():
    df = users.copy()
    df['test'] = -1
    df['ciao'] = -1
    df['prova'] = -1
    times = []
    for i in range(1000):
        times += [get_time_mul(df)]  # measure the multi-column .loc write
    avg_time = statistics.mean(times)
    del df
    return avg_time

def tot_time(ops, x, k):
    # ops rows, k .loc writes per row, x ms per write -> total hours
    time_millis = ops * k * x
    hours = time_millis/1000/60/60
    return hours

tot = len(grouped_reviews)
x = test()
print("hours:", tot_time(tot, x, 6))
x = test_mul()
print("hours mul:", tot_time(tot, x, 1))
hours: 53.065662455
hours mul: 8.774814544166667
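These estimates show that even the multi-column variant of the row-by-row .loc writes would take hours. A fully vectorized alternative (not the approach actually run below) would rename the aggregate columns to match and let DataFrame.update overwrite every matching cell in one call; a sketch on toy data:

```python
import pandas as pd

# Toy 'users' table indexed by user_id, pre-filled with defaults.
users_toy = pd.DataFrame(
    {'average_stars': [3.7, 3.7, 3.7], 'num_reviews': [0, 0, 0]},
    index=['u1', 'u2', 'u3'])

# Toy per-user aggregates, shaped like grouped_reviews (note: no row for u2).
grouped_toy = pd.DataFrame(
    {'stars': [4.5, 2.0], 'num': [10, 3]},
    index=['u1', 'u3'])

# Align on the index and overwrite in a single vectorized call,
# instead of one .loc write per row.
users_toy.update(grouped_toy.rename(
    columns={'stars': 'average_stars', 'num': 'num_reviews'}))

print(users_toy)   # u1 and u3 updated, u2 left at its defaults
```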
In [9]:
count = 1
tot = len(grouped_reviews)
print("tot:", tot)

for index, row in grouped_reviews.iterrows():
    uid = index
    num = row['num']
    stars = row['stars']
    num_bin = row['num_bin']
    stars_bin = row['stars_bin']
    num_real = row['num_real']
    stars_real = row['stars_real']
    
    cols = ["num_reviews", "average_stars", "num_reviews_bin",
            "average_stars_bin", "num_reviews_real", "average_stars_real"]
    vals = [num, stars, num_bin, stars_bin, num_real, stars_real]
    users.loc[uid, cols] = vals
    
    count += 1
    if count % 1000 == 0:
        percent = (count/tot)*100
        print("row {}/{} - {}%".format(count, tot, percent))
tot: 954447
row 1000/954447 - 0.10477271131870078%
row 2000/954447 - 0.20954542263740156%
row 3000/954447 - 0.31431813395610236%
row 4000/954447 - 0.4190908452748031%
row 5000/954447 - 0.5238635565935038%
row 6000/954447 - 0.6286362679122047%
row 7000/954447 - 0.7334089792309054%
row 8000/954447 - 0.8381816905496062%
row 9000/954447 - 0.942954401868307%
row 10000/954447 - 1.0477271131870076%
[...]
row 945000/954447 - 99.01021219617223%
row 946000/954447 - 99.11498490749094%
row 947000/954447 - 99.21975761880964%
row 948000/954447 - 99.32453033012834%
row 949000/954447 - 99.42930304144704%
row 950000/954447 - 99.53407575276574%
row 951000/954447 - 99.63884846408445%
row 952000/954447 - 99.74362117540313%
row 953000/954447 - 99.84839388672184%
row 954000/954447 - 99.95316659804054%
In [10]:
users = users.reset_index()
users.to_pickle('../dataset/m2_n9/users.pickle')
_del_all()

Business-level features

In [4]:
restaurants = _pd.read_pickle("../dataset/restaurants.pickle")
review_hist = _pd.read_pickle('../dataset/m2_n9/review_hist.pickle')
avg_stars = review_hist['stars'].mean()
In [5]:
restaurants = restaurants.assign(average_stars=avg_stars)
restaurants = restaurants.assign(num_reviews=0)
restaurants = restaurants.assign(average_stars_bin=avg_stars)
restaurants = restaurants.assign(num_reviews_bin=0)
restaurants = restaurants.assign(average_stars_real=avg_stars)
restaurants = restaurants.assign(num_reviews_real=0)
restaurants = restaurants.set_index('business_id')
restaurants.head()
Out[5]:
name address cuisine postal_code latitude longitude review_count stars OutdoorSeating BusinessAcceptsCreditCards ... Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
business_id
QXAEGFB4oINsVuTFxEYKFQ Emerald Chinese Restaurant 30 Eglinton Avenue W Chinese L5R 3E7 43.605499 -79.652289 128 2.5 False NaN ... 00:00:00 01:00:00 01:00:00 00:00:00 3.703313 0 3.703313 0 3.703313 0
gnKjwL_1w79qoiV3IC_xQQ Musashi Japanese Restaurant 10110 Johnston Rd, Ste 15 Japanese 28210 35.092564 -80.859132 170 4.0 False True ... 21:30:00 22:00:00 22:00:00 21:00:00 3.703313 0 3.703313 0 3.703313 0
1Dfx3zM-rW4n-31KeC8sJg Taco Bell 2450 E Indian School Rd Mexican 85016 33.495194 -112.028588 18 3.0 False True ... 01:00:00 01:00:00 01:00:00 00:00:00 3.703313 0 3.703313 0 3.703313 0
fweCYi8FmbJXHCqLnwuk8w Marco's Pizza 5981 Andrews Rd Italian 44060 41.708520 -81.359556 16 4.0 False True ... 00:00:00 01:00:00 01:00:00 00:00:00 3.703313 0 3.703313 0 3.703313 0
PZ-LZzSlhSe9utkQYU8pFg Carluccio's Tivoli Gardens 1775 E Tropicana Ave, Ste 29 Italian 89119 36.100016 -115.128529 40 4.0 False True ... NaT NaT NaT NaT 3.703313 0 3.703313 0 3.703313 0

5 rows × 36 columns

In [6]:
grouped_reviews = review_hist.groupby('business_id').apply(_f)
grouped_reviews.head()
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\ipykernel_launcher.py:10: RuntimeWarning: Mean of empty slice.
  # Remove the CWD from sys.path while we load stuff.
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\numpy\core\_methods.py:80: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
Out[6]:
num stars num_bin stars_bin num_real stars_real
business_id
--1UhMGODdWsrMastO9DZw 22.0 4.227273 14.0 4.214286 16.978214 4.267477
--6MefnULPED_I942VcFNA 38.0 3.157895 32.0 3.218750 33.948759 3.218815
--9e1ONYQuAa-CB_Rrw7Tw 1504.0 4.107048 1216.0 4.088816 1205.738732 4.092415
--DaPTJW3-tB1vP-PfdTEg 40.0 3.650000 33.0 3.575758 34.780500 3.642790
--FBCX-N37CMYDfs790Bnw 122.0 3.737705 90.0 3.611111 93.369750 3.646290
In [7]:
count = 1
tot = len(grouped_reviews)
print("tot:", tot)

for index, row in grouped_reviews.iterrows():
    uid = index
    num = row['num']
    stars = row['stars']
    num_bin = row['num_bin']
    stars_bin = row['stars_bin']
    num_real = row['num_real']
    stars_real = row['stars_real']
    
    cols = ["num_reviews", "average_stars", "num_reviews_bin",
            "average_stars_bin", "num_reviews_real", "average_stars_real"]
    vals = [num, stars, num_bin, stars_bin, num_real, stars_real]
    restaurants.loc[uid, cols] = vals
    
    count += 1
    if count % 1000 == 0:
        percent = (count/tot)*100
        print("row {}/{} - {}%".format(count, tot, percent))
tot: 56850
row 1000/56850 - 1.759014951627089%
row 2000/56850 - 3.518029903254178%
row 3000/56850 - 5.277044854881266%
row 4000/56850 - 7.036059806508356%
row 5000/56850 - 8.795074758135444%
row 6000/56850 - 10.554089709762533%
row 7000/56850 - 12.313104661389621%
row 8000/56850 - 14.072119613016712%
row 9000/56850 - 15.8311345646438%
row 10000/56850 - 17.590149516270888%
[...]
row 47000/56850 - 82.67370272647318%
row 48000/56850 - 84.43271767810026%
row 49000/56850 - 86.19173262972735%
row 50000/56850 - 87.95074758135443%
row 51000/56850 - 89.70976253298153%
row 52000/56850 - 91.46877748460862%
row 53000/56850 - 93.22779243623572%
row 54000/56850 - 94.9868073878628%
row 55000/56850 - 96.74582233948989%
row 56000/56850 - 98.50483729111697%
In [8]:
restaurants = restaurants.reset_index()
restaurants.to_pickle('../dataset/m2_n9/restaurants.pickle')
_del_all()

User-business-level features

3.1. average rating given by a certain user to each category

In [2]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants.head()
Out[2]:
business_id name address cuisine postal_code latitude longitude review_count stars OutdoorSeating ... Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 QXAEGFB4oINsVuTFxEYKFQ Emerald Chinese Restaurant 30 Eglinton Avenue W Chinese L5R 3E7 43.605499 -79.652289 128 2.5 False ... 00:00:00 01:00:00 01:00:00 00:00:00 2.726496 117.0 2.718750 96.0 2.730197 95.873087
1 gnKjwL_1w79qoiV3IC_xQQ Musashi Japanese Restaurant 10110 Johnston Rd, Ste 15 Japanese 28210 35.092564 -80.859132 170 4.0 False ... 21:30:00 22:00:00 22:00:00 21:00:00 4.063291 158.0 4.094203 138.0 4.067541 139.112078
2 1Dfx3zM-rW4n-31KeC8sJg Taco Bell 2450 E Indian School Rd Mexican 85016 33.495194 -112.028588 18 3.0 False ... 01:00:00 01:00:00 01:00:00 00:00:00 3.125000 16.0 2.769231 13.0 2.847327 12.604125
3 fweCYi8FmbJXHCqLnwuk8w Marco's Pizza 5981 Andrews Rd Italian 44060 41.708520 -81.359556 16 4.0 False ... 00:00:00 01:00:00 01:00:00 00:00:00 4.230769 13.0 4.166667 12.0 4.142021 10.965903
4 PZ-LZzSlhSe9utkQYU8pFg Carluccio's Tivoli Gardens 1775 E Tropicana Ave, Ste 29 Italian 89119 36.100016 -115.128529 40 4.0 False ... NaT NaT NaT NaT 4.097561 41.0 4.212121 33.0 4.167159 33.655622

5 rows × 37 columns

In [5]:
restaurants.columns
Out[5]:
Index(['business_id', 'name', 'address', 'cuisine', 'postal_code', 'latitude',
       'longitude', 'review_count', 'stars', 'OutdoorSeating',
       'BusinessAcceptsCreditCards', 'RestaurantsDelivery',
       'RestaurantsReservations', 'WiFi', 'Alcohol', 'categories', 'city',
       'Monday_Open', 'Tuesday_Open', 'Wednesday_Open', 'Thursday_Open',
       'Friday_Open', 'Saturday_Open', 'Sunday_Open', 'Monday_Close',
       'Tuesday_Close', 'Wednesday_Close', 'Thursday_Close', 'Friday_Close',
       'Saturday_Close', 'Sunday_Close', 'average_stars', 'num_reviews',
       'average_stars_bin', 'num_reviews_bin', 'average_stars_real',
       'num_reviews_real'],
      dtype='object')
In [3]:
review_hist = _pd.read_pickle('../dataset/m2_n9/review_hist.pickle')
review_hist.head()
Out[3]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
5 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 4 0 0 0 2013-01-20 13:25:59 1 0.989000
6 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 3 5 4 5 2016-05-07 01:21:02 1 0.964619
7 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 3 1 1 2010-10-05 19:12:35 1 0.867897
10 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 4 0 0 0 2011-11-30 02:11:15 1 0.975404
11 -I5umRTkhw15RqpKMl_o1Q -mA3-1mN4JIEkqOtdbNXCQ mRUVMJkUGxrByzMQ2MuOpA 1 0 1 0 2017-12-15 23:27:08 1 0.972826
In [4]:
joined_reviews = review_hist.join(restaurants.set_index('business_id'), on = 'business_id', lsuffix='_review', rsuffix='_rest')
joined_reviews.head()
Out[4]:
review_id user_id business_id stars_review useful funny cool date bin_truth_score real_truth_score ... Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
5 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 4 0 0 0 2013-01-20 13:25:59 1 0.989000 ... 22:00:00 23:00:00 23:00:00 NaT 3.642276 123.0 3.594059 101.0 3.596369 99.505441
6 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 3 5 4 5 2016-05-07 01:21:02 1 0.964619 ... 00:00:00 02:00:00 02:00:00 00:00:00 3.292453 106.0 3.320000 100.0 3.289493 94.480786
7 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 3 1 1 2010-10-05 19:12:35 1 0.867897 ... 22:00:00 23:00:00 23:00:00 23:00:00 2.655172 29.0 2.875000 24.0 2.727490 23.413530
10 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 4 0 0 0 2011-11-30 02:11:15 1 0.975404 ... 23:00:00 00:00:00 00:00:00 23:00:00 4.491259 1144.0 4.499535 1075.0 4.502904 1031.313358
11 -I5umRTkhw15RqpKMl_o1Q -mA3-1mN4JIEkqOtdbNXCQ mRUVMJkUGxrByzMQ2MuOpA 1 0 1 0 2017-12-15 23:27:08 1 0.972826 ... 00:00:00 02:00:00 02:00:00 00:00:00 3.666667 12.0 3.833333 6.0 3.650460 7.482296

5 rows × 46 columns

In [8]:
categories = ', '.join(list(restaurants['categories'].unique()))
categories = categories.split(', ')
print(len(categories))

cat = []
for h in categories:
    if h not in cat:
        cat.append(h)
        
print(len(cat))

cuisines = ', '.join(list(restaurants['cuisine'].unique()))
cuisines = cuisines.split(', ')
print(len(cuisines))

# Underscore-prefixed on purpose, so that it survives _del_all().
_cuisines_unique = []
for cuisine in cuisines:
    if cuisine not in _cuisines_unique:
        _cuisines_unique.append(cuisine)
        
print("Number of cuisines: {0}".format(len(_cuisines_unique)))
print(_cuisines_unique)
173884
761
249
Number of cuisines: 10
['Chinese', 'Japanese', 'Mexican', 'Italian', 'Others', 'American', 'Korean', 'Mediterranean', 'Thai', 'Asian Fusion']
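The deduplication loops above can also be written more compactly with dict.fromkeys, which deduplicates while preserving first-seen order; a toy sketch of the same split-and-deduplicate pattern:

```python
import pandas as pd

# Toy stand-in for a column of comma-joined category strings.
toy = pd.Series(['Pizza, Italian', 'Sushi, Japanese', 'Pizza, Vegan'])

# Join the unique strings, split into tokens, then deduplicate in order
# (dict keys preserve insertion order in Python 3.7+).
tokens = ', '.join(toy.unique()).split(', ')
unique_tokens = list(dict.fromkeys(tokens))
print(unique_tokens)   # ['Pizza', 'Italian', 'Sushi', 'Japanese', 'Vegan']
```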
In [5]:
joined_reviews.to_pickle('../dataset/m2_n9/join_restaurants_reviewhist.pickle')
In [ ]:
_del_all()

Checkpoint

In [ ]:
joined_reviews =  _pd.read_pickle('../dataset/m2_n9/join_restaurants_reviewhist.pickle')
joined_reviews.head()
Out[ ]:
index review_id user_id business_id stars_review useful funny cool date bin_truth_score ... Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 5 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 4 0 0 0 2013-01-20 13:25:59 1 ... 22:00:00 23:00:00 23:00:00 NaT 3.642276 123.0 3.594059 101.0 3.596369 99.505441
1 6 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 3 5 4 5 2016-05-07 01:21:02 1 ... 00:00:00 02:00:00 02:00:00 00:00:00 3.292453 106.0 3.320000 100.0 3.289493 94.480786
2 7 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 3 1 1 2010-10-05 19:12:35 1 ... 22:00:00 23:00:00 23:00:00 23:00:00 2.655172 29.0 2.875000 24.0 2.727490 23.413530
3 10 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 4 0 0 0 2011-11-30 02:11:15 1 ... 23:00:00 00:00:00 00:00:00 23:00:00 4.491259 1144.0 4.499535 1075.0 4.502904 1031.313358
4 11 -I5umRTkhw15RqpKMl_o1Q -mA3-1mN4JIEkqOtdbNXCQ mRUVMJkUGxrByzMQ2MuOpA 1 0 1 0 2017-12-15 23:27:08 1 ... 00:00:00 02:00:00 02:00:00 00:00:00 3.666667 12.0 3.833333 6.0 3.650460 7.482296

5 rows × 47 columns

In [ ]:
# joined_reviews = joined_reviews.reset_index()
joined_reviews = joined_reviews[['review_id', 'user_id', 'business_id', 'bin_truth_score', 'real_truth_score', 'cuisine', 'stars_review']]
joined_reviews.head()
Out[ ]:
review_id user_id business_id bin_truth_score real_truth_score cuisine stars_review
0 fdiNeiN_hoCxCMy2wTRW9g w31MKYsNFMrjhWxxAb5wIw eU_713ec6fTGNO4BegRaww 1 0.989000 Italian 4
1 G7XHMxG0bx9oBJNECG4IFg jlu4CztcSxrKx56ba1a5AQ 3fw2X5bZYeW9xCz_zGhOHg 1 0.964619 Chinese 3
2 8e9HxxLjjqc9ez5ezzN7iQ d6xvYpyzcfbF_AZ8vMB7QA zvO-PJCpNk4fgAVUnExYAA 1 0.867897 American 1
3 kbtscdyz6lvrtGjD1quQTg FIk4lQQu1eTe2EpzQ4xhBA 8mIrX_LrOnAqWsB5JrOojQ 1 0.975404 Others 4
4 -I5umRTkhw15RqpKMl_o1Q -mA3-1mN4JIEkqOtdbNXCQ mRUVMJkUGxrByzMQ2MuOpA 1 0.972826 American 1
In [ ]:
#cuisines_unique = ['Chinese', 'Japanese', 'Mexican', 'Italian', 'Others', 'American', 'Korean', 'Mediterranean', 'Thai', 'Asian Fusion']
In [ ]:
def each_cuisine_ratings(grouped):
    d = {}
    index = []
    # Plain average of the user's ratings per cuisine.
    for cuisine in _cuisines_unique:
        cuisine_av = cuisine + "_av"
        cuisine_records = _np.ma.masked_where(~grouped['cuisine'].str.contains(cuisine), grouped['stars_review']).compressed()
        d[cuisine_av] = cuisine_records.mean()
        index.append(cuisine_av)
    
    # Same average, restricted to reviews the SVC labelled as genuine.
    for cuisine in _cuisines_unique:
        cuisine_av_bin = cuisine + "_av_bin"
        non_fake = grouped[grouped['bin_truth_score'] > 0]
        cuisine_records = _np.ma.masked_where(~non_fake['cuisine'].str.contains(cuisine), non_fake['stars_review']).compressed()
        d[cuisine_av_bin] = cuisine_records.mean()
        index.append(cuisine_av_bin)
    
    # Same average, weighted by the calibrated truth probability.
    for cuisine in _cuisines_unique:
        cuisine_av_real = cuisine + "_av_real"
        cuisine_records = _np.ma.masked_where(~grouped['cuisine'].str.contains(cuisine), grouped['stars_review']).compressed()
        cuisine_truth_score = _np.ma.masked_where(~grouped['cuisine'].str.contains(cuisine), grouped['real_truth_score']).compressed()
        d[cuisine_av_real] = _np.ma.average(cuisine_records, weights=cuisine_truth_score)
        index.append(cuisine_av_real)
    
    return _pd.Series(d, index=index)
In [ ]:
grouped_reviews = joined_reviews.groupby('user_id').apply(each_cuisine_ratings)
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\ipykernel_launcher.py:8: RuntimeWarning: Mean of empty slice.
  
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\numpy\core\_methods.py:80: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\ipykernel_launcher.py:17: RuntimeWarning: Mean of empty slice.
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\numpy\ma\extras.py:607: RuntimeWarning: invalid value encountered in double_scalars
  avg = np.multiply(a, wgt, dtype=result_dtype).sum(axis)/scl
In [ ]:
grouped_reviews.head()
Out[ ]:
Chinese_av Japanese_av Mexican_av Italian_av Others_av American_av Korean_av Mediterranean_av Thai_av Asian Fusion_av ... Chinese_av_real Japanese_av_real Mexican_av_real Italian_av_real Others_av_real American_av_real Korean_av_real Mediterranean_av_real Thai_av_real Asian Fusion_av_real
user_id
---1lKK3aKOuomHnwAkAow NaN 2.333333 3.666667 3.166667 4.352941 3.9375 4.0 NaN 3.25 3.5 ... NaN 2.035809 3.720609 3.226154 4.524134 3.963953 4.0 NaN 3.134858 3.443446
---PLwSf5gKdIoVnyRHgBA NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN ... NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN
---cu1hq55BP9DWVXXKHZg NaN NaN 4.000000 NaN NaN NaN NaN NaN NaN 1.0 ... NaN NaN 4.000000 NaN NaN NaN NaN NaN NaN 1.000000
---udAKDsn0yQXmzbWQNSw NaN NaN NaN NaN 5.000000 NaN NaN NaN 4.00 NaN ... NaN NaN NaN NaN 5.000000 NaN NaN NaN 4.000000 NaN
--0RtXvcOIE4XbErYca6Rw NaN NaN NaN NaN 4.000000 NaN NaN NaN NaN NaN ... NaN NaN NaN NaN 4.000000 NaN NaN NaN NaN NaN

5 rows × 30 columns

Checkpoint 2

In [3]:
users = _pd.read_pickle('../dataset/m2_n9/users.pickle')
users.head()
Out[3]:
user_id user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 2.0 3.000000 2.0 3.017247 1.957334
1 4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 12.0 3.777778 9.0 3.626019 9.280533
2 bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 13.0 3.500000 12.0 3.326528 10.350275
3 dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 1.0 NaN 0.0 5.000000 0.329341
4 MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 5.0 4.400000 5.0 4.378799 4.720870
In [4]:
users = users.assign(av_rat_chinese_cuisine = _np.nan, av_rat_japanese_cuisine = _np.nan, av_rat_mexican_cuisine = _np.nan, 
                     av_rat_italian_cuisine = _np.nan, av_rat_others_cuisine = _np.nan, av_rat_american_cuisine = _np.nan, 
                     av_rat_korean_cuisine = _np.nan, av_rat_mediterranean_cuisine = _np.nan, av_rat_thai_cuisine = _np.nan, 
                     av_rat_asianfusion_cuisine = _np.nan)

users = users.assign(av_rat_chinese_cuisine_bin = _np.nan, av_rat_japanese_cuisine_bin = _np.nan, av_rat_mexican_cuisine_bin = _np.nan, 
                     av_rat_italian_cuisine_bin = _np.nan, av_rat_others_cuisine_bin = _np.nan, av_rat_american_cuisine_bin = _np.nan, 
                     av_rat_korean_cuisine_bin = _np.nan, av_rat_mediterranean_cuisine_bin = _np.nan, av_rat_thai_cuisine_bin = _np.nan, 
                     av_rat_asianfusion_cuisine_bin = _np.nan)

users = users.assign(av_rat_chinese_cuisine_real = _np.nan, av_rat_japanese_cuisine_real = _np.nan, av_rat_mexican_cuisine_real = _np.nan, 
                     av_rat_italian_cuisine_real = _np.nan, av_rat_others_cuisine_real = _np.nan, av_rat_american_cuisine_real = _np.nan, 
                     av_rat_korean_cuisine_real = _np.nan, av_rat_mediterranean_cuisine_real = _np.nan, av_rat_thai_cuisine_real = _np.nan, 
                     av_rat_asianfusion_cuisine_real = _np.nan)

users = users.set_index('user_id')
In [5]:
users.head()
Out[5]:
user_name average_stars yelping_since review years_of_elite fans useful cool funny friends ... av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
user_id
l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN

5 rows × 45 columns

In [3]:
grouped_reviews = _pd.read_pickle('../dataset/m2_n9/grouped_reviews.pickle')
grouped_reviews.head()
Out[3]:
Chinese_av Japanese_av Mexican_av Italian_av Others_av American_av Korean_av Mediterranean_av Thai_av Asian Fusion_av ... Chinese_av_real Japanese_av_real Mexican_av_real Italian_av_real Others_av_real American_av_real Korean_av_real Mediterranean_av_real Thai_av_real Asian Fusion_av_real
user_id
---1lKK3aKOuomHnwAkAow NaN 2.333333 3.666667 3.166667 4.352941 3.9375 4.0 NaN 3.25 3.5 ... NaN 2.035809 3.720609 3.226154 4.524134 3.963953 4.0 NaN 3.134858 3.443446
---PLwSf5gKdIoVnyRHgBA NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN ... NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN
---cu1hq55BP9DWVXXKHZg NaN NaN 4.000000 NaN NaN NaN NaN NaN NaN 1.0 ... NaN NaN 4.000000 NaN NaN NaN NaN NaN NaN 1.000000
---udAKDsn0yQXmzbWQNSw NaN NaN NaN NaN 5.000000 NaN NaN NaN 4.00 NaN ... NaN NaN NaN NaN 5.000000 NaN NaN NaN 4.000000 NaN
--0RtXvcOIE4XbErYca6Rw NaN NaN NaN NaN 4.000000 NaN NaN NaN NaN NaN ... NaN NaN NaN NaN 4.000000 NaN NaN NaN NaN NaN

5 rows × 30 columns

In [8]:
# split grouped_reviews and users datasets into n_cores parts, where n_cores is the number of available processors
n_cores = _os.cpu_count()

df_out = _np.array_split(users, n_cores)   # list of input dataframes (from users)

df_out_names = []   # list of paths of output dataframes (from grouped_reviews)
df_in = []
for i, df in enumerate(df_out):
    name = "../dataset/m2_n9/tmp/df_out_" + str(i) + ".pickle"
    df_out_names += [name]
    
    # use .reindex() instead of .loc: passing list-likes with missing labels
    # to .loc raises a KeyError in recent pandas versions
    df_tmp = grouped_reviews.reindex(df.index)
    df_in += [df_tmp]
In [9]:
from multiproc_utils import user_business_features

if __name__ ==  '__main__':
    with _Pool(processes=n_cores) as p:
        p.map(user_business_features, zip(df_in, df_out, df_out_names))
In [10]:
users_chunks = []

# add chunks produced by subprocesses
for name in df_out_names:
    df_out_i = _pd.read_pickle(name)
    users_chunks += [df_out_i]
    _os.remove(name)

users = _pd.concat(users_chunks)
users.head()
Out[10]:
user_name average_stars yelping_since review years_of_elite fans useful cool funny friends ... av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
user_id
l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 ... NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 ... NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 ... 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 ... NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 ... NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000

5 rows × 45 columns

In [11]:
users = users.reset_index()
users.to_pickle('../dataset/m2_n9/users_2.pickle')
In [12]:
users.shape
Out[12]:
(1148098, 46)
In [13]:
users_pre = _pd.read_pickle("../dataset/m2_n9/users.pickle")
users_pre.shape
Out[13]:
(1148098, 16)
In [4]:
len(grouped_reviews)
Out[4]:
954447
In [7]:
print("expected diff:", users.shape[0]-len(grouped_reviews))
expected diff: 193651
In [8]:
users_tmp = users[['av_rat_chinese_cuisine', 'av_rat_japanese_cuisine', 'av_rat_mexican_cuisine', 'av_rat_italian_cuisine', 
            'av_rat_others_cuisine', 'av_rat_american_cuisine', 'av_rat_korean_cuisine', 'av_rat_mediterranean_cuisine',
            'av_rat_thai_cuisine', 'av_rat_asianfusion_cuisine',
           
           'av_rat_chinese_cuisine_bin', 'av_rat_japanese_cuisine_bin', 'av_rat_mexican_cuisine_bin', 
           'av_rat_italian_cuisine_bin', 'av_rat_others_cuisine_bin', 'av_rat_american_cuisine_bin', 
           'av_rat_korean_cuisine_bin', 'av_rat_mediterranean_cuisine_bin', 'av_rat_thai_cuisine_bin', 
           'av_rat_asianfusion_cuisine_bin',
           
           'av_rat_chinese_cuisine_real', 'av_rat_japanese_cuisine_real', 'av_rat_mexican_cuisine_real', 
           'av_rat_italian_cuisine_real', 'av_rat_others_cuisine_real', 'av_rat_american_cuisine_real', 
           'av_rat_korean_cuisine_real', 'av_rat_mediterranean_cuisine_real', 'av_rat_thai_cuisine_real', 
           'av_rat_asianfusion_cuisine_real']]

count_na = users_tmp.isna().all(axis = 1).sum()   # rows where every cuisine feature is missing

print("actual diff:", count_na)
actual diff: 193651
In [4]:
_del_all()

3.2. Average of the ratings given by a certain user to the cuisines of a certain restaurant.

3.2.1 Test set
In [3]:
review_test = _pd.read_pickle('../dataset/m2_n9/review_test.pickle')
review_test = review_test.sort_values(by=['review_id'])
review_test = review_test.reset_index(drop = True)
review_test.shape
Out[3]:
(153993, 10)
In [4]:
review_test.head()
Out[4]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
0 ---j05qHS2X7FkXjjMKKtA E6Aoz-3s4avfweIjziHjbA cTbFJzHQzFSX-z3JF4abKQ 5 1 0 1 2018-11-02 15:01:11 -1 0.229833
1 --4GjusuUCMh24c_oh_cEg YOOsYiXGEtGFX_wSeZNcww JytR7WvKyytDQNwOHUzSEg 4 1 0 0 2018-10-13 00:07:17 1 0.987487
2 --4RpVT5wHJ9AfnZkIC3tw Bdw4E8jFVd6-CbhrNAJ_EA -CfFjcCcGGDM9MVH_d42RQ 5 0 0 0 2018-10-14 04:41:42 -1 0.524751
3 --4vJzoC0m5h-yodXv-qCw jm4a1GghQ4zLCN3lQGMQUQ XMPBg6r_LqZhy9Cf-4ZJrA 2 0 0 0 2018-11-01 19:11:50 1 0.979875
4 --B9JxEb5gY5gAgD2BRhDQ x_6VdQU3CIdakwHod-dNzA Fn_IxcCtZl1EoS81sq_s9w 3 1 0 0 2018-10-31 02:01:04 1 0.662471
In [5]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants = restaurants.reset_index(drop = True)
restaurants = restaurants[['cuisine', 'business_id']]
restaurants.head()
Out[5]:
cuisine business_id
0 Chinese QXAEGFB4oINsVuTFxEYKFQ
1 Japanese gnKjwL_1w79qoiV3IC_xQQ
2 Mexican 1Dfx3zM-rW4n-31KeC8sJg
3 Italian fweCYi8FmbJXHCqLnwuk8w
4 Italian PZ-LZzSlhSe9utkQYU8pFg
In [6]:
review_test_rest = review_test.join(restaurants.set_index('business_id'), on = 'business_id')
review_test_rest.to_pickle('../dataset/m2_n9/review_test_cuisine.pickle')
review_test_rest.shape
Out[6]:
(153993, 11)
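The join above is a left join on `business_id`, so every review row is kept and any restaurant missing from `restaurants` would end up with a `NaN` cuisine. A minimal illustration of the pattern, with hypothetical ids:

```python
import pandas as pd

reviews = pd.DataFrame({'business_id': ['b1', 'b2', 'b3'], 'stars': [5, 3, 4]})
cuisines = pd.DataFrame({'business_id': ['b1', 'b2'], 'cuisine': ['Italian', 'Thai']})

# left join: all review rows survive, unmatched businesses get NaN
joined = reviews.join(cuisines.set_index('business_id'), on = 'business_id')
```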
In [7]:
review_test_rest.head()
Out[7]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine
0 ---j05qHS2X7FkXjjMKKtA E6Aoz-3s4avfweIjziHjbA cTbFJzHQzFSX-z3JF4abKQ 5 1 0 1 2018-11-02 15:01:11 -1 0.229833 American
1 --4GjusuUCMh24c_oh_cEg YOOsYiXGEtGFX_wSeZNcww JytR7WvKyytDQNwOHUzSEg 4 1 0 0 2018-10-13 00:07:17 1 0.987487 Others
2 --4RpVT5wHJ9AfnZkIC3tw Bdw4E8jFVd6-CbhrNAJ_EA -CfFjcCcGGDM9MVH_d42RQ 5 0 0 0 2018-10-14 04:41:42 -1 0.524751 Mediterranean
3 --4vJzoC0m5h-yodXv-qCw jm4a1GghQ4zLCN3lQGMQUQ XMPBg6r_LqZhy9Cf-4ZJrA 2 0 0 0 2018-11-01 19:11:50 1 0.979875 Others
4 --B9JxEb5gY5gAgD2BRhDQ x_6VdQU3CIdakwHod-dNzA Fn_IxcCtZl1EoS81sq_s9w 3 1 0 0 2018-10-31 02:01:04 1 0.662471 Italian
In [8]:
del restaurants

users = _pd.read_pickle('../dataset/m2_n9/users_2.pickle')
users.head()
Out[8]:
user_id user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 2.0 3.000000 2.0 3.017247 1.957334 NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 12.0 3.777778 9.0 3.626019 9.280533 NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 13.0 3.500000 12.0 3.326528 10.350275 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 1.0 NaN 0.0 5.000000 0.329341 NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 5.0 4.400000 5.0 4.378799 4.720870 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [9]:
users = users[['user_id', 'av_rat_chinese_cuisine', 'av_rat_japanese_cuisine', 'av_rat_mexican_cuisine', 'av_rat_italian_cuisine', 
            'av_rat_others_cuisine', 'av_rat_american_cuisine', 'av_rat_korean_cuisine', 'av_rat_mediterranean_cuisine',
            'av_rat_thai_cuisine', 'av_rat_asianfusion_cuisine',
           
           'av_rat_chinese_cuisine_bin', 'av_rat_japanese_cuisine_bin', 'av_rat_mexican_cuisine_bin', 
           'av_rat_italian_cuisine_bin', 'av_rat_others_cuisine_bin', 'av_rat_american_cuisine_bin', 
           'av_rat_korean_cuisine_bin', 'av_rat_mediterranean_cuisine_bin', 'av_rat_thai_cuisine_bin', 
           'av_rat_asianfusion_cuisine_bin',
           
           'av_rat_chinese_cuisine_real', 'av_rat_japanese_cuisine_real', 'av_rat_mexican_cuisine_real', 
           'av_rat_italian_cuisine_real', 'av_rat_others_cuisine_real', 'av_rat_american_cuisine_real', 
           'av_rat_korean_cuisine_real', 'av_rat_mediterranean_cuisine_real', 'av_rat_thai_cuisine_real', 
           'av_rat_asianfusion_cuisine_real']]

users.head()
Out[9]:
user_id av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [10]:
test_join = review_test_rest.join(users.set_index('user_id'), on = 'user_id', lsuffix = '_test_review', rsuffix = '_users')
test_join.shape
Out[10]:
(153993, 41)
In [11]:
test_join.head()
Out[11]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 ---j05qHS2X7FkXjjMKKtA E6Aoz-3s4avfweIjziHjbA cTbFJzHQzFSX-z3JF4abKQ 5 1 0 1 2018-11-02 15:01:11 -1 0.229833 American NaN 4.00 NaN 4.500 3.444444 4.000000 NaN NaN NaN 4.0 NaN 4.00 NaN 4.500 3.444444 4.285714 NaN NaN NaN 4.0 NaN 4.000000 NaN 4.508384 3.446572 3.969038 NaN NaN NaN 4.102936
1 --4GjusuUCMh24c_oh_cEg YOOsYiXGEtGFX_wSeZNcww JytR7WvKyytDQNwOHUzSEg 4 1 0 0 2018-10-13 00:07:17 1 0.987487 Others 3.75 3.75 4.0 3.625 3.604167 3.230769 2.0 4.4 3.666667 3.0 3.75 3.75 4.666667 3.625 3.607143 3.333333 2.0 4.25 3.666667 3.0 3.756843 3.726028 4.287215 3.525072 3.610109 3.170261 2.0 4.355007 3.664197 3.021022
2 --4RpVT5wHJ9AfnZkIC3tw Bdw4E8jFVd6-CbhrNAJ_EA -CfFjcCcGGDM9MVH_d42RQ 5 0 0 0 2018-10-14 04:41:42 -1 0.524751 Mediterranean NaN NaN 1.0 NaN 5.000000 5.000000 NaN 5.0 NaN NaN NaN NaN 1.000000 NaN 5.000000 NaN NaN NaN NaN NaN NaN NaN 1.000000 NaN 5.000000 5.000000 NaN 5.000000 NaN NaN
3 --4vJzoC0m5h-yodXv-qCw jm4a1GghQ4zLCN3lQGMQUQ XMPBg6r_LqZhy9Cf-4ZJrA 2 0 0 0 2018-11-01 19:11:50 1 0.979875 Others NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 --B9JxEb5gY5gAgD2BRhDQ x_6VdQU3CIdakwHod-dNzA Fn_IxcCtZl1EoS81sq_s9w 3 1 0 0 2018-10-31 02:01:04 1 0.662471 Italian NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
In [12]:
test_join.to_pickle('../dataset/m2_n9/join_test_users_review.pickle')
del users, review_test_rest
In [13]:
def _restaturants_users_cuisine_ratings(grouped):
    # average the user's per-cuisine rating history over the cuisines of the reviewed
    # restaurant, for the standard, binarized and real-valued variants of the feature
    cuisines = str(grouped['cuisine']).split(", ")
    
    index = ['review_id', 'cuisine_av_hist', 'cuisine_av_hist_bin', 'cuisine_av_hist_real']
    d = {'review_id' : grouped['review_id']}
    
    for suffix, col in [('', 'cuisine_av_hist'), ('_bin', 'cuisine_av_hist_bin'), ('_real', 'cuisine_av_hist_real')]:
        values = [grouped["av_rat_{0}_cuisine{1}".format(c.lower().replace(" ", ""), suffix)]
                  for c in cuisines]
        d[col] = _np.average(values)
    
    return _pd.Series(d, index = index)
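As a sanity check, the averaging logic can be exercised on a hand-built row (all values hypothetical): a restaurant with two cuisines should get the unweighted mean of the user's two per-cuisine averages.

```python
import numpy as np
import pandas as pd

# toy joined row: a two-cuisine restaurant and the user's per-cuisine averages
row = pd.Series({
    'review_id': 'r1',
    'cuisine': 'Italian, Mediterranean',
    'av_rat_italian_cuisine': 4.0,
    'av_rat_mediterranean_cuisine': 3.0,
})

cuisines = str(row['cuisine']).split(", ")
avg = np.average([row["av_rat_{0}_cuisine".format(c.lower().replace(" ", ""))]
                  for c in cuisines])
# (4.0 + 3.0) / 2
```

Note that `np.average` propagates `NaN`: if the user has no history for any one of the restaurant's cuisines, the whole feature becomes `NaN`, which matches the `NaN` rows visible in the outputs below.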
In [14]:
applied_test = test_join.apply(_restaturants_users_cuisine_ratings, axis = 1)
applied_test.shape
Out[14]:
(153993, 4)
In [15]:
applied_test = applied_test.sort_values(by=['review_id'])
applied_test = applied_test.reset_index(drop = True)
applied_test.head()
Out[15]:
review_id cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real
0 ---j05qHS2X7FkXjjMKKtA 4.000000 4.285714 3.969038
1 --4GjusuUCMh24c_oh_cEg 3.604167 3.607143 3.610109
2 --4RpVT5wHJ9AfnZkIC3tw 5.000000 NaN 5.000000
3 --4vJzoC0m5h-yodXv-qCw NaN NaN NaN
4 --B9JxEb5gY5gAgD2BRhDQ NaN NaN NaN
In [16]:
applied_test.to_pickle('../dataset/m2_n9/applied_test_users_review.pickle')
In [17]:
review_test.shape
Out[17]:
(153993, 10)
In [18]:
review_test.head()
Out[18]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
0 ---j05qHS2X7FkXjjMKKtA E6Aoz-3s4avfweIjziHjbA cTbFJzHQzFSX-z3JF4abKQ 5 1 0 1 2018-11-02 15:01:11 -1 0.229833
1 --4GjusuUCMh24c_oh_cEg YOOsYiXGEtGFX_wSeZNcww JytR7WvKyytDQNwOHUzSEg 4 1 0 0 2018-10-13 00:07:17 1 0.987487
2 --4RpVT5wHJ9AfnZkIC3tw Bdw4E8jFVd6-CbhrNAJ_EA -CfFjcCcGGDM9MVH_d42RQ 5 0 0 0 2018-10-14 04:41:42 -1 0.524751
3 --4vJzoC0m5h-yodXv-qCw jm4a1GghQ4zLCN3lQGMQUQ XMPBg6r_LqZhy9Cf-4ZJrA 2 0 0 0 2018-11-01 19:11:50 1 0.979875
4 --B9JxEb5gY5gAgD2BRhDQ x_6VdQU3CIdakwHod-dNzA Fn_IxcCtZl1EoS81sq_s9w 3 1 0 0 2018-10-31 02:01:04 1 0.662471
In [19]:
review_test = review_test.assign(cuisine_av_hist = applied_test['cuisine_av_hist'],
                                 cuisine_av_hist_bin = applied_test['cuisine_av_hist_bin'],
                                 cuisine_av_hist_real = applied_test['cuisine_av_hist_real'])
review_test.shape
Out[19]:
(153993, 13)
In [20]:
review_test.head()
Out[20]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real
0 ---j05qHS2X7FkXjjMKKtA E6Aoz-3s4avfweIjziHjbA cTbFJzHQzFSX-z3JF4abKQ 5 1 0 1 2018-11-02 15:01:11 -1 0.229833 4.000000 4.285714 3.969038
1 --4GjusuUCMh24c_oh_cEg YOOsYiXGEtGFX_wSeZNcww JytR7WvKyytDQNwOHUzSEg 4 1 0 0 2018-10-13 00:07:17 1 0.987487 3.604167 3.607143 3.610109
2 --4RpVT5wHJ9AfnZkIC3tw Bdw4E8jFVd6-CbhrNAJ_EA -CfFjcCcGGDM9MVH_d42RQ 5 0 0 0 2018-10-14 04:41:42 -1 0.524751 5.000000 NaN 5.000000
3 --4vJzoC0m5h-yodXv-qCw jm4a1GghQ4zLCN3lQGMQUQ XMPBg6r_LqZhy9Cf-4ZJrA 2 0 0 0 2018-11-01 19:11:50 1 0.979875 NaN NaN NaN
4 --B9JxEb5gY5gAgD2BRhDQ x_6VdQU3CIdakwHod-dNzA Fn_IxcCtZl1EoS81sq_s9w 3 1 0 0 2018-10-31 02:01:04 1 0.662471 NaN NaN NaN
In [21]:
test_set = review_test
test_set.to_pickle('../dataset/m2_n9/review_test_cuisine_final.pickle')
_del_all()
3.2.2 Training set
In [22]:
review_train = _pd.read_pickle('../dataset/m2_n9/review_train.pickle')
review_train = review_train.sort_values(by=['review_id'])
review_train = review_train.reset_index(drop = True)
review_train.shape
Out[22]:
(558386, 10)
In [23]:
review_train.head()
Out[23]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 2018-05-26 13:49:33 -1 0.595406
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 2018-04-15 18:54:27 1 0.993384
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 2018-07-07 15:16:09 1 0.519254
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 2018-05-09 13:59:37 1 0.983368
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 2018-06-16 17:00:42 1 0.866956
In [24]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants = restaurants.reset_index(drop = True)
restaurants = restaurants[['cuisine', 'business_id']]
restaurants.head()
Out[24]:
cuisine business_id
0 Chinese QXAEGFB4oINsVuTFxEYKFQ
1 Japanese gnKjwL_1w79qoiV3IC_xQQ
2 Mexican 1Dfx3zM-rW4n-31KeC8sJg
3 Italian fweCYi8FmbJXHCqLnwuk8w
4 Italian PZ-LZzSlhSe9utkQYU8pFg
In [25]:
review_train_rest = review_train.join(restaurants.set_index('business_id'), on = 'business_id')
review_train_rest.to_pickle('../dataset/m2_n9/review_train_cuisine.pickle')
review_train_rest.shape
Out[25]:
(558386, 11)
In [26]:
review_train_rest.head()
Out[26]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 2018-05-26 13:49:33 -1 0.595406 Italian, Mediterranean
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 2018-04-15 18:54:27 1 0.993384 Others
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 2018-07-07 15:16:09 1 0.519254 American
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 2018-05-09 13:59:37 1 0.983368 Others
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 2018-06-16 17:00:42 1 0.866956 American
In [27]:
del restaurants

users = _pd.read_pickle('../dataset/m2_n9/users_2.pickle')
users.head()
Out[27]:
user_id user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 2.0 3.000000 2.0 3.017247 1.957334 NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 12.0 3.777778 9.0 3.626019 9.280533 NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 13.0 3.500000 12.0 3.326528 10.350275 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 1.0 NaN 0.0 5.000000 0.329341 NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 5.0 4.400000 5.0 4.378799 4.720870 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [28]:
users = users[['user_id', 'av_rat_chinese_cuisine', 'av_rat_japanese_cuisine', 'av_rat_mexican_cuisine', 'av_rat_italian_cuisine', 
            'av_rat_others_cuisine', 'av_rat_american_cuisine', 'av_rat_korean_cuisine', 'av_rat_mediterranean_cuisine',
            'av_rat_thai_cuisine', 'av_rat_asianfusion_cuisine',
           
           'av_rat_chinese_cuisine_bin', 'av_rat_japanese_cuisine_bin', 'av_rat_mexican_cuisine_bin', 
           'av_rat_italian_cuisine_bin', 'av_rat_others_cuisine_bin', 'av_rat_american_cuisine_bin', 
           'av_rat_korean_cuisine_bin', 'av_rat_mediterranean_cuisine_bin', 'av_rat_thai_cuisine_bin', 
           'av_rat_asianfusion_cuisine_bin',
           
           'av_rat_chinese_cuisine_real', 'av_rat_japanese_cuisine_real', 'av_rat_mexican_cuisine_real', 
           'av_rat_italian_cuisine_real', 'av_rat_others_cuisine_real', 'av_rat_american_cuisine_real', 
           'av_rat_korean_cuisine_real', 'av_rat_mediterranean_cuisine_real', 'av_rat_thai_cuisine_real', 
           'av_rat_asianfusion_cuisine_real']]

users.head()
Out[28]:
user_id av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [29]:
train_join = review_train_rest.join(users.set_index('user_id'), on = 'user_id', lsuffix = '_train_review', rsuffix = '_users')
train_join.shape
Out[29]:
(558386, 41)
In [30]:
train_join.head()
Out[30]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 2018-05-26 13:49:33 -1 0.595406 Italian, Mediterranean NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN 3.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN 3.022921 NaN NaN NaN NaN NaN
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 2018-04-15 18:54:27 1 0.993384 Others 3.833333 2.875 4.0 3.25 3.013889 3.125 3.333333 3.4 2.666667 2.666667 3.823529 2.6 4.0 3.25 3.014925 3.066667 3.333333 3.4 2.4 2.666667 3.821347 2.726907 4.0 3.303025 3.039444 3.145065 2.847918 3.387395 2.572254 2.752899
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 2018-07-07 15:16:09 1 0.519254 American NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 2018-05-09 13:59:37 1 0.983368 Others NaN NaN NaN 1.00 1.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN 1.000000 NaN NaN NaN NaN NaN NaN NaN NaN 1.000000 1.000000 NaN NaN NaN NaN NaN
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 2018-06-16 17:00:42 1 0.866956 American NaN NaN 4.0 4.50 5.000000 2.750 NaN NaN NaN NaN NaN NaN 4.0 4.50 5.000000 2.666667 NaN NaN NaN NaN NaN NaN 4.0 4.507847 5.000000 2.748879 NaN NaN NaN NaN
In [31]:
train_join.to_pickle('../dataset/m2_n9/join_train_users_review.pickle')
del users, review_train_rest
In [32]:
applied_train = train_join.apply(_restaturants_users_cuisine_ratings, axis = 1)
applied_train.shape
Out[32]:
(558386, 4)
In [33]:
applied_train = applied_train.sort_values(by=['review_id'])
applied_train = applied_train.reset_index(drop = True)
applied_train.head()
Out[33]:
review_id cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real
0 ---HLAnHbuLi7vd5TL6uYg NaN NaN NaN
1 ---L4b6VR6HoB-q7cfMWIA 3.013889 3.014925 3.039444
2 ---sPYSgArT4Sd5v1nDVMQ NaN NaN NaN
3 --0SzSMXVUoAXfackNoB4g 1.000000 1.000000 1.000000
4 --1JMhPk6K9fZo4FOp_yMw 2.750000 2.666667 2.748879
In [34]:
applied_train.to_pickle('../dataset/m2_n9/applied_train_users_review.pickle')
In [35]:
review_train.shape
Out[35]:
(558386, 10)
In [36]:
review_train.head()
Out[36]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 2018-05-26 13:49:33 -1 0.595406
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 2018-04-15 18:54:27 1 0.993384
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 2018-07-07 15:16:09 1 0.519254
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 2018-05-09 13:59:37 1 0.983368
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 2018-06-16 17:00:42 1 0.866956
In [37]:
review_train = review_train.assign(cuisine_av_hist = applied_train['cuisine_av_hist'],
                                   cuisine_av_hist_bin = applied_train['cuisine_av_hist_bin'],
                                   cuisine_av_hist_real = applied_train['cuisine_av_hist_real'])
review_train.shape
Out[37]:
(558386, 13)
In [38]:
review_train.head()
Out[38]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 2018-05-26 13:49:33 -1 0.595406 NaN NaN NaN
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 2018-04-15 18:54:27 1 0.993384 3.013889 3.014925 3.039444
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 2018-07-07 15:16:09 1 0.519254 NaN NaN NaN
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 2018-05-09 13:59:37 1 0.983368 1.000000 1.000000 1.000000
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 2018-06-16 17:00:42 1 0.866956 2.750000 2.666667 2.748879
In [39]:
train_set = review_train
train_set.to_pickle('../dataset/m2_n9/review_train_cuisine_final.pickle')
_del_all()

4. User-based collaborative filtering approach

$pred(u, r) = a_u + \frac{\sum_{u_i \in U} sim(u, u_i) \cdot (a_{u_i, r} - a_{u_i})}{\sum_{u_i \in U} sim(u, u_i)}$

where $a_u$ is the average rating of user $u$, $a_{u_i, r}$ is the rating that user $u_i$ gave to restaurant $r$, and $sim(u, u_i)$ is the similarity between the two users.
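A minimal sketch of this mean-centered, similarity-weighted prediction rule (function and variable names here are ours for illustration, not taken from the notebook's `multiproc_utils`):

```python
import numpy as np

def predict_rating(user_avg, sims, neighbor_ratings, neighbor_avgs):
    """User-based CF prediction:
    pred(u, r) = a_u + sum_i sim(u, u_i) * (a_{u_i,r} - a_{u_i}) / sum_i sim(u, u_i)
    """
    sims = np.asarray(sims, dtype=float)
    # Center each neighbor's rating around that neighbor's own average.
    centered = np.asarray(neighbor_ratings, dtype=float) - np.asarray(neighbor_avgs, dtype=float)
    denom = sims.sum()
    if denom == 0:
        return user_avg  # no usable neighbors: fall back to the user's own average
    return user_avg + float(sims @ centered) / denom
```

Centering by each neighbor's average compensates for users who rate systematically high or low before their opinions are combined.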

In [3]:
users = _pd.read_pickle('../dataset/m2_n9/users_2.pickle')
users.set_index('user_id', inplace=True)
users.shape
Out[3]:
(1148098, 45)
In [4]:
from multiproc_utils import cols_std, cols_bin, cols_real

users = users[[*cols_std, *cols_bin, *cols_real]]
users.shape
Out[4]:
(1148098, 30)
In [5]:
users = users.fillna(users.mean())
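`fillna(users.mean())` performs column-wise mean imputation: every NaN is replaced by the mean of its own column, computed over the non-missing values. A tiny sketch of the behaviour on hypothetical data:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1.0, np.nan, 3.0],
                   "b": [np.nan, 2.0, 4.0]})

# df.mean() is computed per column, skipping NaNs: a -> 2.0, b -> 3.0.
filled = df.fillna(df.mean())
```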
In [6]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants.set_index('business_id', inplace=True)
In [7]:
review_hist = _pd.read_pickle('../dataset/m2_n9/review_hist.pickle')
review_hist.shape
Out[7]:
(3489305, 10)

Training set

In [8]:
review_train = _pd.read_pickle('../dataset/m2_n9/review_train_cuisine_final.pickle')
review_train.set_index('review_id', inplace=True)
review_train.sort_values('business_id', inplace=True)
review_train.shape
Out[8]:
(558386, 12)
In [9]:
review_train = review_train.assign(coll_score=_np.nan, coll_score_bin=_np.nan, coll_score_real=_np.nan)
review_train.shape
Out[9]:
(558386, 15)
In [10]:
# split review_train into n_cores chunks (n_cores = number of available processors) and,
# for each chunk, keep only the matching rows of review_hist, users and restaurants
n_cores = _os.cpu_count()

review_train_splits = _np.array_split(review_train, n_cores)   # list of input dataframes (from review_train)
review_split_names = []   # list of paths of review_train dataframes
user_splits = []
restaurant_splits = []
review_hist_splits = []

for i, df in enumerate(review_train_splits):
    name = "../dataset/m2_n9/tmp/review_train_split_" + str(i) + ".pickle"
    review_split_names += [name]
    this_review_hist = review_hist.loc[review_hist.business_id.isin(df.business_id.unique())]   # only the history rows for this chunk's restaurants
    review_hist_splits += [this_review_hist]
    user_ids = set(df.user_id) | set(this_review_hist.user_id)
    user_splits += [users.loc[user_ids]]
    restaurant_splits += [restaurants.loc[df.business_id.unique()]]
In [11]:
from multiproc_utils import set_coll_scores

if __name__ == '__main__':
    with _Pool(processes=n_cores) as p:
        p.map(set_coll_scores, zip(review_train_splits, review_split_names, review_hist_splits, user_splits, restaurant_splits))
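This is a scatter/gather pattern: the training reviews are split into `n_cores` chunks, each worker receives one chunk together with only the side-table rows it needs, and the results are reassembled afterwards. A sequential sketch of the same pattern (`score_chunk` is a stand-in of our own for `set_coll_scores`, and the `Pool` call is commented out so the example stays self-contained; the notebook's real workers write their output to the pickle paths, which the next cell reloads):

```python
import numpy as np
import pandas as pd
from multiprocessing import Pool

def score_chunk(args):
    # Stand-in worker: receives one chunk plus its output path.
    chunk, path = args
    return chunk.assign(doubled=chunk["x"] * 2)

df = pd.DataFrame({"x": range(8)})
chunks = np.array_split(df, 4)          # scatter: 4 chunks of 2 rows each
paths = ["tmp_split_%d.pickle" % i for i in range(4)]

# with Pool(processes=4) as p:
#     results = p.map(score_chunk, zip(chunks, paths))
results = [score_chunk(a) for a in zip(chunks, paths)]  # sequential stand-in

out = pd.concat(results)                # gather: same rows, same order as df
```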
In [12]:
review_chunks = []

# add chunks produced by subprocesses
for name in review_split_names:
    review_chunks += [_pd.read_pickle(name)]
    _os.remove(name)

review_train = _pd.concat(review_chunks)
review_train.shape
Out[12]:
(558386, 15)
In [13]:
review_train.reset_index(inplace=True)
review_train.head()
Out[13]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633
In [14]:
review_train.to_pickle('../dataset/m2_n9/review_train.pickle')
del review_train

Test set

In [15]:
review_test = _pd.read_pickle('../dataset/m2_n9/review_test_cuisine_final.pickle')
review_test.set_index('review_id', inplace=True)
review_test.shape
Out[15]:
(153993, 12)
In [16]:
review_test = review_test.assign(coll_score=_np.nan, coll_score_bin=_np.nan, coll_score_real=_np.nan)
review_test.shape
Out[16]:
(153993, 15)
In [17]:
# split review_test into n_cores chunks and, for each chunk, keep only the matching rows of review_hist, users and restaurants

review_test.sort_values('business_id', inplace=True)
review_test_splits = _np.array_split(review_test, n_cores)   # list of input dataframes (from review_test)
review_split_names = []   # list of paths of review_test dataframes
user_splits = []
restaurant_splits = []
review_hist_splits = []

for i, df in enumerate(review_test_splits):
    name = "../dataset/m2_n9/tmp/review_test_split_" + str(i) + ".pickle"
    review_split_names += [name]
    this_review_hist = review_hist.loc[review_hist.business_id.isin(df.business_id.unique())]   # only the history rows for this chunk's restaurants
    review_hist_splits += [this_review_hist]
    user_ids = set(df.user_id) | set(this_review_hist.user_id)
    user_splits += [users.loc[user_ids]]
    restaurant_splits += [restaurants.loc[df.business_id.unique()]]
In [18]:
from multiproc_utils import set_coll_scores

if __name__ == '__main__':
    with _Pool(processes=n_cores) as p:
        p.map(set_coll_scores, zip(review_test_splits, review_split_names, review_hist_splits, user_splits, restaurant_splits))
In [19]:
review_chunks = []

# add chunks produced by subprocesses
for name in review_split_names:
    review_chunks += [_pd.read_pickle(name)]
    _os.remove(name)

review_test = _pd.concat(review_chunks)
review_test.shape
Out[19]:
(153993, 15)
In [20]:
review_test.reset_index(inplace=True)
review_test.head()
Out[20]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN
In [21]:
review_test.to_pickle('../dataset/m2_n9/review_test.pickle')
_del_all()

5. Some more preprocessing

  • We don't need the checkin dataset, and from the tips dataset we keep only the "compliments" feature;
  • The training set is a join of all the data needed for training;
  • The test set is a join of all the data needed for predictions and performance evaluation (labels included);
  • The label is a feature 'likes': 1 if the user likes the restaurant (4 or 5 stars), 0 otherwise (1, 2 or 3 stars).

5.1 Training set

In [22]:
review_train = _pd.read_pickle('../dataset/m2_n9/review_train.pickle')
review_train['likes'] = _np.where(review_train['stars'].isin([4, 5]), 1, 0)
review_train.head(10)
Out[22]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0
5 rPNt-m2pdt-OR_NMQnHjCQ tKIihU81IA3NjpsADuR-Tg --6MefnULPED_I942VcFNA 5 3 0 2 2018-07-08 19:29:51 1 0.907921 4.400000 4.400000 4.395327 4.376569 4.405801 4.313458 1
6 Kg582pH05mZO_E6WS8PrKA XNOs3Wz1Q_zdRgm1Hy05fg --6MefnULPED_I942VcFNA 1 2 2 1 2018-08-14 21:59:34 -1 0.184327 NaN NaN NaN NaN NaN NaN 0
7 gkW6_UqV9b2XI_5ae8rBCg HSHuSCJvIvf_Tof62uZPEw --6MefnULPED_I942VcFNA 2 2 1 0 2018-08-19 04:01:19 -1 0.687126 1.800000 1.692308 1.768813 1.816353 1.733574 1.724494 0
8 02voOwsYf0cEdKNzt5IkwA yvpX68yurPsope6KhBZrYA --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-02-19 03:29:10 1 0.935335 NaN NaN NaN NaN NaN NaN 1
9 M67I-I5ATaqtVLtKZTgygw gvh8bvei5vwfoIYbNIvNDQ --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-06-24 18:41:08 -1 0.242671 3.000000 3.000000 3.040677 3.007972 3.009842 3.062459 1
In [23]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants = restaurants.reset_index(drop = True)
restaurants.head()
Out[23]:
business_id name address cuisine postal_code latitude longitude review_count stars OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 QXAEGFB4oINsVuTFxEYKFQ Emerald Chinese Restaurant 30 Eglinton Avenue W Chinese L5R 3E7 43.605499 -79.652289 128 2.5 False NaN False True No Full_Bar Specialty Food, Restaurants, Dim Sum, Imported... Mississauga 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 00:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 00:00:00 2.726496 117.0 2.718750 96.0 2.730197 95.873087
1 gnKjwL_1w79qoiV3IC_xQQ Musashi Japanese Restaurant 10110 Johnston Rd, Ste 15 Japanese 28210 35.092564 -80.859132 170 4.0 False True False True No Beer&Wine Sushi Bars, Restaurants, Japanese Charlotte 17:30:00 NaT 17:30:00 17:30:00 17:30:00 17:30:00 17:30:00 21:30:00 NaT 21:30:00 21:30:00 22:00:00 22:00:00 21:00:00 4.063291 158.0 4.094203 138.0 4.067541 139.112078
2 1Dfx3zM-rW4n-31KeC8sJg Taco Bell 2450 E Indian School Rd Mexican 85016 33.495194 -112.028588 18 3.0 False True False False No No Restaurants, Breakfast & Brunch, Mexican, Taco... Phoenix 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 01:00:00 00:00:00 3.125000 16.0 2.769231 13.0 2.847327 12.604125
3 fweCYi8FmbJXHCqLnwuk8w Marco's Pizza 5981 Andrews Rd Italian 44060 41.708520 -81.359556 16 4.0 False True True False NaN No Italian, Restaurants, Pizza, Chicken Wings Mentor-on-the-Lake 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 00:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 00:00:00 4.230769 13.0 4.166667 12.0 4.142021 10.965903
4 PZ-LZzSlhSe9utkQYU8pFg Carluccio's Tivoli Gardens 1775 E Tropicana Ave, Ste 29 Italian 89119 36.100016 -115.128529 40 4.0 False True False True No Full_Bar Restaurants, Italian Las Vegas NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT 4.097561 41.0 4.212121 33.0 4.167159 33.655622
In [24]:
review_rest_train = review_train.join(restaurants.set_index('business_id'), on = 'business_id', lsuffix = '_review', rsuffix = '_restaurant')
review_rest_train.head()
Out[24]:
review_id user_id business_id stars_review useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759
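`join` with `lsuffix`/`rsuffix` only renames the columns that appear on both sides (here `stars`, which becomes `stars_review` and `stars_restaurant`); all other columns keep their names. A small sketch with hypothetical data:

```python
import pandas as pd

reviews = pd.DataFrame({"business_id": ["b1", "b2"], "stars": [5, 2]})
restaurants = pd.DataFrame({"business_id": ["b1", "b2"],
                            "stars": [4.0, 3.0]}).set_index("business_id")

# 'stars' exists on both sides, so it is suffixed; 'business_id' is not.
joined = reviews.join(restaurants, on="business_id",
                      lsuffix="_review", rsuffix="_restaurant")
```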
In [25]:
print(len(review_train))
print(len(review_rest_train))
558386
558386
In [26]:
tips = _pd.read_pickle('../dataset/m2_n9/tips_train.pickle')
tips = tips.reset_index(drop = True)
tips.head()
Out[26]:
user_id business_id tips_date compliment_count
0 Fzz-0v1yHLaWuTV64b1miA EZZjaiV8ik05NUepqdeP2A 2018-02-14 0
1 AuSB69SSnaPNTwwbFk93MQ bAHDSbpJE3kKJkW9OBgOyw 2018-03-23 0
2 ouk36OGbx25nO23b10L5jw THO77IL6DLob9Agt9QCjsw 2018-03-22 0
3 gwmyGLz4eBm9QiBU_Ze2KQ RJOFGZZf3ho04ku0fcFRdA 2018-03-29 0
4 BzcdTNAe_jtXfnXFdFYQsA _pBXtjN43eqMV0XZTz7nmw 2018-04-02 0
In [27]:
tips_agg = tips.groupby(['business_id', 'user_id'])['compliment_count'].sum()
tips_agg.head()
Out[27]:
business_id             user_id               
--6MefnULPED_I942VcFNA  EisUuXVeVJN_FcFiE-tqwA    0
--7zmmkVg-IMGaXbuVd0SQ  9LlkZJ7NPsFSFMnIih8X1w    0
--9e1ONYQuAa-CB_Rrw7Tw  2J4PhasBxLtIv-kiS3_FiA    0
                        G2ZiNXL4rZdSxzaDSAAODQ    0
                        GFSZXppCJaO4oiqdgknWnA    0
Name: compliment_count, dtype: int64
In [28]:
review_tip_train = review_rest_train.join(tips_agg, on=['business_id', 'user_id'], lsuffix = '_review', rsuffix = '_tip')
review_tip_train.head()
Out[28]:
review_id user_id business_id stars_review useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real compliment_count
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.0
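Joining the aggregated Series on `['business_id', 'user_id']` matches each review row against its (restaurant, user) pair in the tips MultiIndex; pairs with no tips get NaN, which is why `compliment_count` is missing for most rows above. A minimal sketch with hypothetical ids:

```python
import pandas as pd

reviews = pd.DataFrame({"business_id": ["b1", "b1", "b2"],
                        "user_id": ["u1", "u2", "u1"]})
tips = pd.DataFrame({"business_id": ["b1"], "user_id": ["u1"],
                     "compliment_count": [3]})
tips_agg = tips.groupby(["business_id", "user_id"])["compliment_count"].sum()

# Rows whose (business_id, user_id) pair left no tips get NaN.
out = reviews.join(tips_agg, on=["business_id", "user_id"])
```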
In [29]:
print(len(review_train))
print(len(review_tip_train))
558386
558386
In [30]:
users = _pd.read_pickle('../dataset/m2_n9/users_2.pickle')
users = users.reset_index(drop = True)
users.head()
Out[30]:
user_id user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 2.0 3.000000 2.0 3.017247 1.957334 NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 12.0 3.777778 9.0 3.626019 9.280533 NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 13.0 3.500000 12.0 3.326528 10.350275 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 1.0 NaN 0.0 5.000000 0.329341 NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 5.0 4.400000 5.0 4.378799 4.720870 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [31]:
train_set = review_tip_train.join(users.set_index('user_id'), on = 'user_id', lsuffix = '_review', rsuffix = '_user')
del review_rest_train, users
train_set.head()
Out[31]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jexie 3.846154 2009-06-24 18 0 1 8 4 4 4 13.0 4.222222 9.0 3.798692 11.168551 NaN 5.0 NaN NaN 3.857143 NaN 3.0 5.0 NaN 2.00 NaN 5.0 NaN NaN 5.000000 NaN 3.00 5.0 NaN 2.000000 NaN 5.0 NaN NaN 3.926822 NaN 2.866472 5.000000 NaN 2.000000
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Alex 3.333333 2015-07-27 18 0 0 1 1 1 454 3.0 3.333333 3.0 3.231620 2.702970 NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.000000 NaN 5.000000 2.353400 NaN NaN NaN NaN
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Mermaid 2.571429 2012-01-01 27 0 0 31 3 3 46 14.0 2.692308 13.0 2.692835 12.704916 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 2.333333 NaN 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 3.0 NaN 2.505665 1.0 2.990709 NaN 3.204491 NaN NaN 1.000000 2.974502 NaN
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jen 3.703313 2013-02-27 2 0 0 1 0 0 622 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.0 Alex 2.980769 2008-11-15 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.0 NaN 2.0 3.040000 2.6 3.6 2.5 1.000000 3.75 2.923077 4.0 NaN 1.666667 3.146341 2.6 3.75 2.5 1.0 3.666667 2.954478 4.0 NaN 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739
In [32]:
print(len(review_train))
print(len(train_set))
558386
558386
In [33]:
train_set.to_pickle('../dataset/m2_n9/model_train_set.pickle')
_del_all()

5.2 Test set

In [34]:
review_test = _pd.read_pickle('../dataset/m2_n9/review_test.pickle')
review_test['likes'] = _np.where(review_test['stars'].isin([4, 5]), 1, 0)
review_test.head()
Out[34]:
review_id user_id business_id stars useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1
In [35]:
print(len(review_test))
153993
In [36]:
restaurants = _pd.read_pickle('../dataset/m2_n9/restaurants.pickle')
restaurants = restaurants.reset_index(drop = True)
restaurants.head()
Out[36]:
business_id name address cuisine postal_code latitude longitude review_count stars OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 QXAEGFB4oINsVuTFxEYKFQ Emerald Chinese Restaurant 30 Eglinton Avenue W Chinese L5R 3E7 43.605499 -79.652289 128 2.5 False NaN False True No Full_Bar Specialty Food, Restaurants, Dim Sum, Imported... Mississauga 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 09:00:00 00:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 00:00:00 2.726496 117.0 2.718750 96.0 2.730197 95.873087
1 gnKjwL_1w79qoiV3IC_xQQ Musashi Japanese Restaurant 10110 Johnston Rd, Ste 15 Japanese 28210 35.092564 -80.859132 170 4.0 False True False True No Beer&Wine Sushi Bars, Restaurants, Japanese Charlotte 17:30:00 NaT 17:30:00 17:30:00 17:30:00 17:30:00 17:30:00 21:30:00 NaT 21:30:00 21:30:00 22:00:00 22:00:00 21:00:00 4.063291 158.0 4.094203 138.0 4.067541 139.112078
2 1Dfx3zM-rW4n-31KeC8sJg Taco Bell 2450 E Indian School Rd Mexican 85016 33.495194 -112.028588 18 3.0 False True False False No No Restaurants, Breakfast & Brunch, Mexican, Taco... Phoenix 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 07:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 01:00:00 00:00:00 3.125000 16.0 2.769231 13.0 2.847327 12.604125
3 fweCYi8FmbJXHCqLnwuk8w Marco's Pizza 5981 Andrews Rd Italian 44060 41.708520 -81.359556 16 4.0 False True True False NaN No Italian, Restaurants, Pizza, Chicken Wings Mentor-on-the-Lake 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 10:00:00 00:00:00 00:00:00 00:00:00 00:00:00 01:00:00 01:00:00 00:00:00 4.230769 13.0 4.166667 12.0 4.142021 10.965903
4 PZ-LZzSlhSe9utkQYU8pFg Carluccio's Tivoli Gardens 1775 E Tropicana Ave, Ste 29 Italian 89119 36.100016 -115.128529 40 4.0 False True False True No Full_Bar Restaurants, Italian Las Vegas NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT NaT 4.097561 41.0 4.212121 33.0 4.167159 33.655622
In [37]:
review_rest_test = review_test.join(restaurants.set_index('business_id'), on = 'business_id', lsuffix = '_review', rsuffix = '_restaurant')
del restaurants
review_rest_test.head()
Out[37]:
review_id user_id business_id stars_review useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1 Delmonico Steakhouse 3355 Las Vegas Blvd S Others 89109 36.123183 -115.169190 1613 4.0 False True False True No Full_Bar Cajun/Creole, Seafood, Steakhouses, Restaurants Las Vegas 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 22:00:00 22:00:00 22:00:00 22:00:00 22:30:00 22:30:00 22:00:00 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732
In [38]:
print(len(review_test))
print(len(review_rest_test))
153993
153993
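The row-count check above confirms the left join did not duplicate reviews. With `merge`, pandas can assert that many-to-one relationship explicitly instead of checking lengths afterwards. A minimal sketch on toy frames (hypothetical data, not the real tables):

```python
import pandas as pd

# Toy stand-ins for review_test and restaurants (hypothetical data).
reviews = pd.DataFrame({'business_id': ['a', 'a', 'b'], 'stars': [5, 3, 4]})
restaurants = pd.DataFrame({'business_id': ['a', 'b'], 'city': ['Phoenix', 'Calgary']})

# A left join keeps one output row per review as long as 'business_id' is
# unique on the right; validate='m:1' raises immediately if it is not.
joined = reviews.merge(restaurants, on='business_id', how='left', validate='m:1')
assert len(joined) == len(reviews)
```

This turns a silent row-explosion bug into an exception at join time rather than a mismatch discovered later.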
In [39]:
tips = _pd.read_pickle('../dataset/m2_n9/tips_test.pickle')
tips = tips.reset_index(drop = True)
tips.head()
Out[39]:
user_id business_id tips_date compliment_count
0 Yr0B0aVb94i2oIm1gLXgfg yGDiAVoQB8LX3OsJ4e2I0A 2018-10-05 0
1 z6qIyc-_oIbgDWAembZ04w wFO5HLn-GSfYCSPPcbyqoA 2018-10-12 0
2 EyYVD9n7PlYYLTSEQ5t14w yhDAzBBjFujZbHwBPfE2eQ 2018-10-15 0
3 90fhUaWIY6ctVAX5jQ2GNQ ZMJOURno1xJS7PG3ZhfeyQ 2018-10-20 0
4 k7J0CjxFoxdSayhlDE-k7w X3qrbOkhdCjm0NTBX7T80Q 2018-10-30 0
In [40]:
tips_agg = tips.groupby(['business_id', 'user_id'])['compliment_count'].sum()
tips_agg.head()
Out[40]:
business_id             user_id               
--9e1ONYQuAa-CB_Rrw7Tw  4vj_0BQeXjCyNB7ESS5mGg    0
                        idlz5ohzqTX5NnOZrxzcsQ    0
                        x_Nu7oNHf4VHwqgV3qLpfg    0
-01XupAWZEXbdNbxNg5mEg  8bt-F30D_tW-ajPG-IpZSA    0
-0DET7VdEQOJVJ_v6klEug  G-6X-llgA_qAxGxocykHzQ    0
Name: compliment_count, dtype: int64
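The aggregation above collapses multiple tips from the same user at the same restaurant into a single compliment total, keyed by the (business_id, user_id) pair. A minimal sketch on toy data:

```python
import pandas as pd

# Toy tips table (hypothetical data, same shape as the real one).
tips = pd.DataFrame({
    'business_id':      ['b1', 'b1', 'b2'],
    'user_id':          ['u1', 'u1', 'u2'],
    'compliment_count': [1, 2, 0],
})

# Group on the composite key and sum the compliments per pair.
agg = tips.groupby(['business_id', 'user_id'])['compliment_count'].sum()
print(agg.loc[('b1', 'u1')])  # -> 3
```

The result is a Series with a two-level index, which is exactly what the later `join` on `['business_id', 'user_id']` expects.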
In [41]:
review_tip_test = review_rest_test.join(tips_agg, on=['business_id', 'user_id'], lsuffix = '_review', rsuffix = '_tip')
review_tip_test.head()
Out[41]:
review_id user_id business_id stars_review useful funny cool date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real compliment_count
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1 Delmonico Steakhouse 3355 Las Vegas Blvd S Others 89109 36.123183 -115.169190 1613 4.0 False True False True No Full_Bar Cajun/Creole, Seafood, Steakhouses, Restaurants Las Vegas 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 22:00:00 22:00:00 22:00:00 22:00:00 22:30:00 22:30:00 22:00:00 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 NaN
In [42]:
print(len(review_test))
print(len(review_tip_test))
153993
153993
In [43]:
users = _pd.read_pickle('../dataset/m2_n9/users_2.pickle')
users = users.reset_index(drop = True)
users.head()
Out[43]:
user_id user_name average_stars yelping_since review years_of_elite fans useful cool funny friends num_reviews average_stars_bin num_reviews_bin average_stars_real num_reviews_real av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 l6BmjZMeQD3rDxWUbiAiow Rashmi 3.000000 2013-10-08 95 3 5 84 25 17 2374 2.0 3.000000 2.0 3.017247 1.957334 NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN NaN NaN NaN 4.0 2.0 4.000000 NaN NaN NaN NaN
1 4XChL029mKr5hydo79Ljxg Jenna 3.500000 2013-02-21 33 0 4 48 16 22 27646 12.0 3.777778 9.0 3.626019 9.280533 NaN NaN NaN NaN 4.0 3.333333 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.666667 NaN 4.0 NaN NaN NaN NaN NaN NaN 4.0 3.501823 NaN 4.0 NaN NaN
2 bc8C_eETBWL0olvFSJJd0w David 3.384615 2013-10-04 16 0 0 28 10 8 358 13.0 3.500000 12.0 3.326528 10.350275 3.5 NaN 2.0 NaN 5.0 3.000000 NaN 3.0 3.666667 2.0 3.5 NaN 2.0 NaN 5.0 3.500000 NaN 3.0 3.666667 2.0 3.399765 NaN 2.0 NaN 5.0 3.123598 NaN 3.0 3.632007 1.997225
3 dD0gZpBctWGdWo9WlGuhlA Angela 5.000000 2014-05-22 17 0 5 30 14 4 12598 1.0 NaN 0.0 5.000000 0.329341 NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.0 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 5.000000
4 MM4RJAeH6yuaN8oZDSt0RA Nancy 4.400000 2013-10-23 361 4 39 1114 665 279 5542 5.0 4.400000 5.0 4.378799 4.720870 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.500000 NaN 5.0 3.000000 5.0 NaN NaN NaN NaN NaN 4.450279 NaN 5.0 3.000000 5.000000
In [44]:
test_set = review_tip_test.join(users.set_index('user_id'), on = 'user_id', lsuffix = '_review', rsuffix = '_user')
del review_rest_test, users
test_set.head()
Out[44]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Fran 3.703313 2017-01-09 67 1 1 16 6 2 3574 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Leung 2.875000 2016-07-20 86 0 1 34 9 12 166 16.0 3.000000 15.0 2.909282 13.563981 1.8 2.20 NaN 4.5 3.333333 1.000000 3.0 NaN NaN NaN 2.000000 2.50 NaN 4.5 3.333333 1.0 3.0 NaN NaN NaN 1.799679 2.460018 NaN 4.571695 3.355656 1.000000 2.924220 NaN NaN NaN
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Jo 4.108108 2017-08-08 227 2 7 99 47 30 286 37.0 4.161290 31.0 4.130733 31.326167 4.3 3.75 NaN 3.5 4.454545 3.666667 3.8 NaN 4.0 4.0 4.333333 3.75 NaN 3.0 4.555556 3.5 4.0 NaN 4.0 4.0 4.299574 3.724926 NaN 3.340936 4.538601 3.626374 3.946442 NaN 4.0 4.0
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Cindy 3.703313 2016-03-09 1 0 0 0 0 0 2110 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1 Delmonico Steakhouse 3355 Las Vegas Blvd S Others 89109 36.123183 -115.169190 1613 4.0 False True False True No Full_Bar Cajun/Creole, Seafood, Steakhouses, Restaurants Las Vegas 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 22:00:00 22:00:00 22:00:00 22:00:00 22:30:00 22:30:00 22:00:00 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 NaN Katie 3.703313 2017-06-15 59 0 0 5 0 1 22 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
In [45]:
print(len(review_test))
print(len(test_set))
153993
153993
In [46]:
test_set.to_pickle('../dataset/m2_n9/model_test_set.pickle')
_del_all()

5.3 Prepare data for the models

We have to fill missing values in the dataset and then convert non-numerical features into numerical ones, or drop the ones our models do not need, so that every remaining feature is readable by our models.

We execute two versions of this step in order to obtain two different datasets: a complete one and a lighter one. The second is used for grid search, which duplicates the input dataset across workers to run in parallel and keep the computation time reasonable.
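The fill-and-encode step can be sketched as follows; the column names here are illustrative toys, not the actual feature handling performed in the cells below:

```python
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder

# Toy frame with a numerical column containing a gap and a categorical one.
df = pd.DataFrame({'stars': [4.0, None, 3.0], 'WiFi': ['No', 'Free', 'No']})

# Fill the numerical gap with the column mean, then map categories to integers.
df['stars'] = df['stars'].fillna(df['stars'].mean())
df[['WiFi']] = OrdinalEncoder().fit_transform(df[['WiFi']])

assert df.isnull().sum().sum() == 0  # no missing values remain
```

Note that `OrdinalEncoder` must see the same category set at train and test time, which is one reason the two splits are concatenated before encoding.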

5.3.1 Complete version

We first summarize what kind of data each feature currently holds, in order to decide how to handle it.

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set.pickle')
train_set.head()
Out[3]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jexie 3.846154 2009-06-24 18 0 1 8 4 4 4 13.0 4.222222 9.0 3.798692 11.168551 NaN 5.0 NaN NaN 3.857143 NaN 3.0 5.0 NaN 2.00 NaN 5.0 NaN NaN 5.000000 NaN 3.00 5.0 NaN 2.000000 NaN 5.0 NaN NaN 3.926822 NaN 2.866472 5.000000 NaN 2.000000
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Alex 3.333333 2015-07-27 18 0 0 1 1 1 454 3.0 3.333333 3.0 3.231620 2.702970 NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.000000 NaN 5.000000 2.353400 NaN NaN NaN NaN
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Mermaid 2.571429 2012-01-01 27 0 0 31 3 3 46 14.0 2.692308 13.0 2.692835 12.704916 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 2.333333 NaN 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 3.0 NaN 2.505665 1.0 2.990709 NaN 3.204491 NaN NaN 1.000000 2.974502 NaN
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jen 3.703313 2013-02-27 2 0 0 1 0 0 622 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.0 Alex 2.980769 2008-11-15 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.0 NaN 2.0 3.040000 2.6 3.6 2.5 1.000000 3.75 2.923077 4.0 NaN 1.666667 3.146341 2.6 3.75 2.5 1.0 3.666667 2.954478 4.0 NaN 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739
In [4]:
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set.pickle')
test_set.head()
Out[4]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Fran 3.703313 2017-01-09 67 1 1 16 6 2 3574 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Leung 2.875000 2016-07-20 86 0 1 34 9 12 166 16.0 3.000000 15.0 2.909282 13.563981 1.8 2.20 NaN 4.5 3.333333 1.000000 3.0 NaN NaN NaN 2.000000 2.50 NaN 4.5 3.333333 1.0 3.0 NaN NaN NaN 1.799679 2.460018 NaN 4.571695 3.355656 1.000000 2.924220 NaN NaN NaN
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Jo 4.108108 2017-08-08 227 2 7 99 47 30 286 37.0 4.161290 31.0 4.130733 31.326167 4.3 3.75 NaN 3.5 4.454545 3.666667 3.8 NaN 4.0 4.0 4.333333 3.75 NaN 3.0 4.555556 3.5 4.0 NaN 4.0 4.0 4.299574 3.724926 NaN 3.340936 4.538601 3.626374 3.946442 NaN 4.0 4.0
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Cindy 3.703313 2016-03-09 1 0 0 0 0 0 2110 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1 Delmonico Steakhouse 3355 Las Vegas Blvd S Others 89109 36.123183 -115.169190 1613 4.0 False True False True No Full_Bar Cajun/Creole, Seafood, Steakhouses, Restaurants Las Vegas 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 22:00:00 22:00:00 22:00:00 22:00:00 22:30:00 22:30:00 22:00:00 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 NaN Katie 3.703313 2017-06-15 59 0 0 5 0 1 22 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
In [5]:
train_test_set = _pd.concat([train_set, test_set], sort=False)
In [6]:
print("train size:", train_set.shape)
print("test size:", test_set.shape)
print("train_test size:", train_test_set.shape)
print(train_set.shape[0] + test_set.shape[0] == train_test_set.shape[0])
_train_len = train_set.shape[0]
train size: (558386, 99)
test size: (153993, 99)
train_test size: (712379, 99)
True
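The concat-then-split pattern used here (`_train_len` is kept so the two frames can be separated again after shared preprocessing) can be sketched on toy frames:

```python
import pandas as pd

# Toy stand-ins for train_set and test_set.
train = pd.DataFrame({'x': [1, 2, 3]})
test = pd.DataFrame({'x': [4, 5]})

# Concatenate so that filling and encoding see both splits consistently,
# remembering the train length for the later positional split.
both = pd.concat([train, test], sort=False)
train_len = len(train)

train_back = both.iloc[:train_len]
test_back = both.iloc[train_len:]
assert len(train_back) == len(train) and len(test_back) == len(test)
```

Splitting by position with `iloc` is safe here because `concat` preserves row order; splitting by index label would not be, since the original indices overlap.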
In [7]:
train_test_set.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712379 entries, 0 to 153992
Data columns (total 99 columns):
review_id                            712379 non-null object
user_id                              712379 non-null object
business_id                          712379 non-null object
stars_review                         712379 non-null int64
useful_review                        712379 non-null int64
funny_review                         712379 non-null int64
cool_review                          712379 non-null int64
date                                 712379 non-null datetime64[ns]
bin_truth_score                      712379 non-null int64
real_truth_score                     712379 non-null float64
cuisine_av_hist                      269090 non-null float64
cuisine_av_hist_bin                  249642 non-null float64
cuisine_av_hist_real                 269090 non-null float64
coll_score                           269090 non-null float64
coll_score_bin                       249642 non-null float64
coll_score_real                      269090 non-null float64
likes                                712379 non-null int32
name                                 712379 non-null object
address                              712379 non-null object
cuisine                              712379 non-null object
postal_code                          712379 non-null object
latitude                             712379 non-null float64
longitude                            712379 non-null float64
review_count                         712379 non-null int64
stars_restaurant                     712379 non-null float64
OutdoorSeating                       669576 non-null object
BusinessAcceptsCreditCards           588620 non-null object
RestaurantsDelivery                  677719 non-null object
RestaurantsReservations              680010 non-null object
WiFi                                 658240 non-null object
Alcohol                              651883 non-null object
categories                           712379 non-null object
city                                 712379 non-null object
Monday_Open                          645277 non-null object
Tuesday_Open                         668918 non-null object
Wednesday_Open                       679256 non-null object
Thursday_Open                        683039 non-null object
Friday_Open                          683398 non-null object
Saturday_Open                        679080 non-null object
Sunday_Open                          626609 non-null object
Monday_Close                         645277 non-null object
Tuesday_Close                        668918 non-null object
Wednesday_Close                      679256 non-null object
Thursday_Close                       683039 non-null object
Friday_Close                         683398 non-null object
Saturday_Close                       679080 non-null object
Sunday_Close                         626609 non-null object
average_stars_review                 712379 non-null float64
num_reviews_review                   712379 non-null float64
average_stars_bin_review             711373 non-null float64
num_reviews_bin_review               712379 non-null float64
average_stars_real_review            712379 non-null float64
num_reviews_real_review              712379 non-null float64
compliment_count                     17732 non-null float64
user_name                            712379 non-null object
average_stars_user                   712379 non-null float64
yelping_since                        712379 non-null object
review                               712379 non-null int64
years_of_elite                       712379 non-null int64
fans                                 712379 non-null int64
useful_user                          712379 non-null int64
cool_user                            712379 non-null int64
funny_user                           712379 non-null int64
friends                              712379 non-null int64
num_reviews_user                     712379 non-null float64
average_stars_bin_user               694363 non-null float64
num_reviews_bin_user                 712379 non-null float64
average_stars_real_user              712379 non-null float64
num_reviews_real_user                712379 non-null float64
av_rat_chinese_cuisine               140210 non-null float64
av_rat_japanese_cuisine              149309 non-null float64
av_rat_mexican_cuisine               195524 non-null float64
av_rat_italian_cuisine               198832 non-null float64
av_rat_others_cuisine                305838 non-null float64
av_rat_american_cuisine              291851 non-null float64
av_rat_korean_cuisine                73297 non-null float64
av_rat_mediterranean_cuisine         122566 non-null float64
av_rat_thai_cuisine                  105389 non-null float64
av_rat_asianfusion_cuisine           166768 non-null float64
av_rat_chinese_cuisine_bin           127182 non-null float64
av_rat_japanese_cuisine_bin          135034 non-null float64
av_rat_mexican_cuisine_bin           175440 non-null float64
av_rat_italian_cuisine_bin           180287 non-null float64
av_rat_others_cuisine_bin            286214 non-null float64
av_rat_american_cuisine_bin          272580 non-null float64
av_rat_korean_cuisine_bin            66395 non-null float64
av_rat_mediterranean_cuisine_bin     108226 non-null float64
av_rat_thai_cuisine_bin              94816 non-null float64
av_rat_asianfusion_cuisine_bin       151573 non-null float64
av_rat_chinese_cuisine_real          140210 non-null float64
av_rat_japanese_cuisine_real         149309 non-null float64
av_rat_mexican_cuisine_real          195524 non-null float64
av_rat_italian_cuisine_real          198832 non-null float64
av_rat_others_cuisine_real           305838 non-null float64
av_rat_american_cuisine_real         291851 non-null float64
av_rat_korean_cuisine_real           73297 non-null float64
av_rat_mediterranean_cuisine_real    122566 non-null float64
av_rat_thai_cuisine_real             105389 non-null float64
av_rat_asianfusion_cuisine_real      166768 non-null float64
dtypes: datetime64[ns](1), float64(53), int32(1), int64(13), object(31)
memory usage: 540.8+ MB
In [8]:
train_test_types = train_test_set.dtypes
In [9]:
for ind, dtype in train_test_types.iteritems():
    if not _np.issubdtype(dtype, _np.number):
        if "id" not in ind:
            uniq_vals = train_test_set[ind].unique()
            null_vals = train_test_set[ind].isnull().sum()
            print(ind + " - " + str(dtype) + "  - unique: " + str(len(uniq_vals)) + " - nulls: " + str(null_vals))
            print(uniq_vals[:10])
            print()
date - datetime64[ns]  - unique: 699960 - nulls: 0
['2018-01-11T19:55:31.000000000' '2018-02-25T17:47:12.000000000'
 '2018-05-06T04:22:48.000000000' '2018-04-22T17:42:09.000000000'
 '2018-05-21T05:09:07.000000000' '2018-07-08T19:29:51.000000000'
 '2018-08-14T21:59:34.000000000' '2018-08-19T04:01:19.000000000'
 '2018-02-19T03:29:10.000000000' '2018-06-24T18:41:08.000000000']

name - object  - unique: 26090 - nulls: 0
['The Spicy Amigos' "John's Chinese BBQ Restaurant" 'Delmonico Steakhouse'
 'Sunnyside Grill' 'The Bar At Bermuda & St. Rose' 'Mm Mm Pizza' 'Sushiya'
 'Happy Moose Bar and Grill' "Denny's" 'Pio Pio']

address - object  - unique: 35750 - nulls: 0
['821 4 Avenue SW' '328 Highway 7 E, Chalmers Gate 11, Unit 10'
 '3355 Las Vegas Blvd S' '1218 Saint Clair Avenue W' '11624 Bermuda Rd'
 '407 S Central Ave' '1950 Chemin Fer Ã\xa0 Cheval' '9436 State Rte 14'
 '6207 Wilson Mills Rd' '1408 E Blvd']

cuisine - object  - unique: 87 - nulls: 0
['Mexican' 'Chinese' 'Others' 'American' 'Japanese' 'Italian'
 'Asian Fusion' 'Korean' 'Mexican, American' 'Thai']

postal_code - object  - unique: 8309 - nulls: 0
['T2P 0K5' 'L4B 3P7' '89109' 'M6E' '89052' '15317' 'J3E 2T6' '44241'
 '44143' '28203']

OutdoorSeating - object  - unique: 4 - nulls: 42803
['True' 'False' nan 'None']

BusinessAcceptsCreditCards - object  - unique: 4 - nulls: 123759
[nan 'True' 'False' 'None']

RestaurantsDelivery - object  - unique: 4 - nulls: 34660
['False' 'True' nan 'None']

RestaurantsReservations - object  - unique: 4 - nulls: 32369
[nan 'True' 'False' 'None']

WiFi - object  - unique: 4 - nulls: 54139
[nan 'No' 'Free' 'Paid']

Alcohol - object  - unique: 4 - nulls: 60496
['Beer&Wine' 'Full_Bar' 'No' nan]

categories - object  - unique: 22832 - nulls: 0
['Restaurants, Mexican' 'Chinese, Restaurants'
 'Cajun/Creole, Seafood, Steakhouses, Restaurants'
 'Restaurants, Breakfast & Brunch'
 'Nightlife, Beer, Wine & Spirits, Bars, Restaurants, American (New), Food'
 'Restaurants, Pizza, Chicken Wings, Salad'
 'Buffets, Restaurants, Japanese, Sushi Bars'
 'Nightlife, Sports Bars, Restaurants, Bars, American (Traditional)'
 'Breakfast & Brunch, American (Traditional), Restaurants, Diners'
 'Restaurants, Spanish, Peruvian, Colombian, Latin American']

city - object  - unique: 671 - nulls: 0
['Calgary' 'Richmond Hill' 'Las Vegas' 'Toronto' 'Henderson' 'Canonsburg'
 'Sainte-Julie' 'Streetsboro' 'Highland Heights' 'Charlotte']

Monday_Open - object  - unique: 59 - nulls: 67102
[datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) datetime.time(16, 0) NaT datetime.time(12, 0)
 datetime.time(10, 0) datetime.time(6, 0) datetime.time(6, 30)]

Tuesday_Open - object  - unique: 56 - nulls: 43461
[datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) datetime.time(16, 0) NaT datetime.time(12, 0)
 datetime.time(10, 0) datetime.time(7, 30) datetime.time(6, 0)]

Wednesday_Open - object  - unique: 60 - nulls: 33123
[datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) datetime.time(16, 0) NaT datetime.time(12, 0)
 datetime.time(10, 0) datetime.time(7, 30) datetime.time(6, 0)]

Thursday_Open - object  - unique: 62 - nulls: 29340
[datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) NaT datetime.time(12, 0) datetime.time(10, 0)
 datetime.time(7, 30) datetime.time(6, 0) datetime.time(15, 0)]

Saturday_Open - object  - unique: 59 - nulls: 33299
[datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) datetime.time(16, 0) NaT datetime.time(12, 0)
 datetime.time(10, 0) datetime.time(7, 30) datetime.time(6, 0)]

Sunday_Open - object  - unique: 62 - nulls: 85770
[NaT datetime.time(11, 0) datetime.time(17, 0) datetime.time(7, 0)
 datetime.time(0, 0) datetime.time(4, 30) datetime.time(12, 0)
 datetime.time(10, 0) datetime.time(7, 30) datetime.time(6, 0)]

Monday_Close - object  - unique: 69 - nulls: 67102
[datetime.time(20, 0) datetime.time(22, 30) datetime.time(22, 0)
 datetime.time(15, 0) datetime.time(0, 0) datetime.time(23, 0)
 datetime.time(21, 0) datetime.time(2, 30) NaT datetime.time(21, 30)]

Tuesday_Close - object  - unique: 70 - nulls: 43461
[datetime.time(20, 0) datetime.time(22, 30) datetime.time(22, 0)
 datetime.time(15, 0) datetime.time(0, 0) datetime.time(23, 0)
 datetime.time(21, 0) datetime.time(2, 30) NaT datetime.time(1, 0)]

Wednesday_Close - object  - unique: 70 - nulls: 33123
[datetime.time(20, 0) datetime.time(22, 30) datetime.time(22, 0)
 datetime.time(15, 0) datetime.time(0, 0) datetime.time(23, 0)
 datetime.time(21, 0) datetime.time(2, 30) NaT datetime.time(1, 0)]

Thursday_Close - object  - unique: 77 - nulls: 29340
[datetime.time(20, 0) datetime.time(22, 30) datetime.time(22, 0)
 datetime.time(15, 0) datetime.time(0, 0) datetime.time(23, 0)
 datetime.time(21, 0) datetime.time(2, 30) NaT datetime.time(1, 0)]

Saturday_Close - object  - unique: 76 - nulls: 33299
[datetime.time(4, 0) datetime.time(22, 30) datetime.time(16, 0)
 datetime.time(0, 0) datetime.time(1, 0) datetime.time(22, 0)
 datetime.time(2, 30) NaT datetime.time(2, 0) datetime.time(21, 30)]

Sunday_Close - object  - unique: 71 - nulls: 85770
[NaT datetime.time(22, 30) datetime.time(22, 0) datetime.time(16, 0)
 datetime.time(0, 0) datetime.time(23, 0) datetime.time(21, 0)
 datetime.time(2, 30) datetime.time(21, 30) datetime.time(15, 0)]

user_name - object  - unique: 41141 - nulls: 0
['Jexie' 'Alex' 'Mermaid' 'Jen' 'Kit' 'Y' 'Belinda' 'Jim' 'Marko' 'Tien']

yelping_since - object  - unique: 4618 - nulls: 0
[datetime.date(2009, 6, 24) datetime.date(2015, 7, 27)
 datetime.date(2012, 1, 1) datetime.date(2013, 2, 27)
 datetime.date(2008, 11, 15) datetime.date(2011, 4, 16)
 datetime.date(2013, 6, 25) datetime.date(2011, 6, 7)
 datetime.date(2011, 6, 4) datetime.date(2016, 7, 5)]

Drop useless features

In [10]:
train_test_set.drop(columns=['date', 'name', 'address', 'yelping_since', 'user_name', 'cuisine'], inplace=True)

Fill missing values

In [11]:
attr_cols = ['OutdoorSeating', 'BusinessAcceptsCreditCards', 'RestaurantsDelivery',
             'RestaurantsReservations', 'WiFi', 'Alcohol']
for col in attr_cols:
    # 'None' is already a legitimate value in these columns, so missing
    # entries are folded into that same category.
    train_test_set[col] = train_test_set[col].fillna('None')
In [12]:
days = ['Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday']
for day in days:
    for suffix in ('Open', 'Close'):
        col = day + '_' + suffix
        # Cast the opening/closing times to strings so they can be one-hot
        # encoded later. Note that astype(str) turns NaT into the literal
        # string 'NaT', which then survives as its own category (e.g. the
        # Monday_Open_NaT dummy column below), so a fillna with the mode
        # after this cast would find no real NaN values left to fill.
        train_test_set[col] = train_test_set[col].astype(str)
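The effect of casting a time column with missing values to strings can be checked on a toy series (the column name here is illustrative, not from the dataset):

```python
import datetime
import pandas as pd

# A toy column shaped like Monday_Open: python time objects plus a missing entry
s = pd.Series([datetime.time(11, 0), pd.NaT], dtype=object)
as_str = s.astype(str)
print(as_str.tolist())  # ['11:00:00', 'NaT']
print(as_str.isnull().sum())  # 0 - no real NaN values remain after the cast
```

This is why the string `'NaT'` later appears as its own indicator column (such as `Monday_Open_NaT`) after one-hot encoding.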
In [13]:
for ind, dtype in train_test_types.iteritems():
    if _np.issubdtype(dtype, _np.floating):
        train_test_set[ind] = train_test_set[ind].fillna(train_test_set[ind].mean())
    elif _np.issubdtype(dtype, _np.integer):
        train_test_set[ind] = train_test_set[ind].fillna(round(train_test_set[ind].mean()))
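The dtype-driven imputation above can be sanity-checked on a toy frame. Note that a pandas column containing NaN is already stored as float64, so in practice the integer branch only touches columns that had no nulls to begin with:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({'score': [4.0, np.nan, 5.0],   # float column with a gap
                   'count': [1, 2, 3]})           # int column, no gaps possible
for col, dtype in df.dtypes.items():
    if np.issubdtype(dtype, np.floating):
        df[col] = df[col].fillna(df[col].mean())
    elif np.issubdtype(dtype, np.integer):
        df[col] = df[col].fillna(round(df[col].mean()))

print(df['score'].tolist())  # [4.0, 4.5, 5.0]
```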
In [14]:
# check if any feature still has null values
train_test_set.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712379 entries, 0 to 153992
Data columns (total 93 columns):
review_id                            712379 non-null object
user_id                              712379 non-null object
business_id                          712379 non-null object
stars_review                         712379 non-null int64
useful_review                        712379 non-null int64
funny_review                         712379 non-null int64
cool_review                          712379 non-null int64
bin_truth_score                      712379 non-null int64
real_truth_score                     712379 non-null float64
cuisine_av_hist                      712379 non-null float64
cuisine_av_hist_bin                  712379 non-null float64
cuisine_av_hist_real                 712379 non-null float64
coll_score                           712379 non-null float64
coll_score_bin                       712379 non-null float64
coll_score_real                      712379 non-null float64
likes                                712379 non-null int32
postal_code                          712379 non-null object
latitude                             712379 non-null float64
longitude                            712379 non-null float64
review_count                         712379 non-null int64
stars_restaurant                     712379 non-null float64
OutdoorSeating                       712379 non-null object
BusinessAcceptsCreditCards           712379 non-null object
RestaurantsDelivery                  712379 non-null object
RestaurantsReservations              712379 non-null object
WiFi                                 712379 non-null object
Alcohol                              712379 non-null object
categories                           712379 non-null object
city                                 712379 non-null object
Monday_Open                          712379 non-null object
Tuesday_Open                         712379 non-null object
Wednesday_Open                       712379 non-null object
Thursday_Open                        712379 non-null object
Friday_Open                          712379 non-null object
Saturday_Open                        712379 non-null object
Sunday_Open                          712379 non-null object
Monday_Close                         712379 non-null object
Tuesday_Close                        712379 non-null object
Wednesday_Close                      712379 non-null object
Thursday_Close                       712379 non-null object
Friday_Close                         712379 non-null object
Saturday_Close                       712379 non-null object
Sunday_Close                         712379 non-null object
average_stars_review                 712379 non-null float64
num_reviews_review                   712379 non-null float64
average_stars_bin_review             712379 non-null float64
num_reviews_bin_review               712379 non-null float64
average_stars_real_review            712379 non-null float64
num_reviews_real_review              712379 non-null float64
compliment_count                     712379 non-null float64
average_stars_user                   712379 non-null float64
review                               712379 non-null int64
years_of_elite                       712379 non-null int64
fans                                 712379 non-null int64
useful_user                          712379 non-null int64
cool_user                            712379 non-null int64
funny_user                           712379 non-null int64
friends                              712379 non-null int64
num_reviews_user                     712379 non-null float64
average_stars_bin_user               712379 non-null float64
num_reviews_bin_user                 712379 non-null float64
average_stars_real_user              712379 non-null float64
num_reviews_real_user                712379 non-null float64
av_rat_chinese_cuisine               712379 non-null float64
av_rat_japanese_cuisine              712379 non-null float64
av_rat_mexican_cuisine               712379 non-null float64
av_rat_italian_cuisine               712379 non-null float64
av_rat_others_cuisine                712379 non-null float64
av_rat_american_cuisine              712379 non-null float64
av_rat_korean_cuisine                712379 non-null float64
av_rat_mediterranean_cuisine         712379 non-null float64
av_rat_thai_cuisine                  712379 non-null float64
av_rat_asianfusion_cuisine           712379 non-null float64
av_rat_chinese_cuisine_bin           712379 non-null float64
av_rat_japanese_cuisine_bin          712379 non-null float64
av_rat_mexican_cuisine_bin           712379 non-null float64
av_rat_italian_cuisine_bin           712379 non-null float64
av_rat_others_cuisine_bin            712379 non-null float64
av_rat_american_cuisine_bin          712379 non-null float64
av_rat_korean_cuisine_bin            712379 non-null float64
av_rat_mediterranean_cuisine_bin     712379 non-null float64
av_rat_thai_cuisine_bin              712379 non-null float64
av_rat_asianfusion_cuisine_bin       712379 non-null float64
av_rat_chinese_cuisine_real          712379 non-null float64
av_rat_japanese_cuisine_real         712379 non-null float64
av_rat_mexican_cuisine_real          712379 non-null float64
av_rat_italian_cuisine_real          712379 non-null float64
av_rat_others_cuisine_real           712379 non-null float64
av_rat_american_cuisine_real         712379 non-null float64
av_rat_korean_cuisine_real           712379 non-null float64
av_rat_mediterranean_cuisine_real    712379 non-null float64
av_rat_thai_cuisine_real             712379 non-null float64
av_rat_asianfusion_cuisine_real      712379 non-null float64
dtypes: float64(53), int32(1), int64(13), object(26)
memory usage: 508.2+ MB

Convert non-numerical features

In [15]:
train_test_set.shape
Out[15]:
(712379, 93)
In [16]:
cat_cols = ['OutdoorSeating', 'BusinessAcceptsCreditCards', 'RestaurantsDelivery', 'RestaurantsReservations', 'WiFi',
            'Alcohol', 'city', 'Monday_Open', 'Tuesday_Open', 'Wednesday_Open', 'Thursday_Open', 'Friday_Open',
            'Saturday_Open', 'Sunday_Open', 'Monday_Close', 'Tuesday_Close', 'Wednesday_Close', 'Thursday_Close',
            'Friday_Close', 'Saturday_Close', 'Sunday_Close']
train_test_set = _pd.get_dummies(train_test_set, columns=cat_cols, prefix=cat_cols)
train_test_set.shape
Out[16]:
(712379, 1689)
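The jump from 93 to 1,689 columns happens because `pd.get_dummies` replaces each categorical column with one indicator column per distinct value, and the opening/closing-time and city columns have many levels each. A small sketch (toy values, not from the dataset):

```python
import pandas as pd

# Two categorical columns with 3 and 2 distinct values respectively
df = pd.DataFrame({'WiFi': ['Free', 'No', 'Paid'],
                   'Alcohol': ['No', 'Full_Bar', 'No']})
out = pd.get_dummies(df, columns=['WiFi', 'Alcohol'], prefix=['WiFi', 'Alcohol'])
print(out.columns.tolist())
# ['WiFi_Free', 'WiFi_No', 'WiFi_Paid', 'Alcohol_Full_Bar', 'Alcohol_No']
```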
In [17]:
oe = _OrdinalEncoder()
In [21]:
train_test_set['postal_code'] = oe.fit_transform(train_test_set['postal_code'].to_numpy().reshape(-1, 1))
train_test_set.shape
Out[21]:
(712379, 1689)
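Unlike one-hot encoding, `OrdinalEncoder` keeps the shape unchanged: it replaces each postal-code string in place with a single integer code (the index of that value in the lexicographically sorted category list). A sketch with made-up postal codes:

```python
import numpy as np
from sklearn.preprocessing import OrdinalEncoder

codes = np.array(['89109', 'M6E', 'T2P 0K5', '89109']).reshape(-1, 1)
enc = OrdinalEncoder()
encoded = enc.fit_transform(codes)
print(encoded.ravel())  # [0. 1. 2. 0.] - digits sort before letters
```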
In [22]:
categories = train_test_set['categories'].str.get_dummies(',')
f1 = lambda x: "categories_" + x
categories.rename(columns=f1, inplace=True)
train_test_set[categories.columns] = categories
train_test_set.drop(columns=['categories'], inplace=True)
train_test_set.shape
Out[22]:
(712379, 2774)
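`Series.str.get_dummies` splits each string on the separator and builds one indicator column per token. One caveat worth noting: since Yelp separates categories with a comma *and a space*, splitting on `','` alone keeps the leading space in the tokens, so e.g. `'Mexican'` and `' Mexican'` can end up as distinct dummy columns. A toy illustration:

```python
import pandas as pd

s = pd.Series(['Restaurants, Mexican', 'Chinese, Restaurants'])
# Splitting on ', ' yields clean tokens...
print(s.str.get_dummies(sep=', ').columns.tolist())
# ['Chinese', 'Mexican', 'Restaurants']
# ...while splitting on ',' keeps the leading space in second tokens
print(s.str.get_dummies(sep=',').columns.tolist())
# [' Mexican', ' Restaurants', 'Chinese', 'Restaurants']
```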

The resulting dataset

In [23]:
train_test_set.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712379 entries, 0 to 153992
Columns: 2774 entries, review_id to categories_Yoga
dtypes: float64(54), int32(1), int64(1099), object(3), uint8(1617)
memory usage: 7.2+ GB
In [24]:
train_set = train_test_set[:_train_len]
test_set = train_test_set[_train_len:]
In [25]:
train_set.head(10)
Out[25]:
(first 10 rows of the train set; 2,774 columns spanning the review, user, and cuisine features followed by the one-hot encoded attribute, city, and opening/closing-time dummies, including the Monday_Open_NaT-style columns produced by the string cast above; the full column listing is truncated here)
Tuesday_Open_22:00:00 Tuesday_Open_22:30:00 Tuesday_Open_23:00:00 Tuesday_Open_NaT Wednesday_Open_00:00:00 Wednesday_Open_00:15:00 Wednesday_Open_01:00:00 Wednesday_Open_01:30:00 Wednesday_Open_03:00:00 Wednesday_Open_03:15:00 Wednesday_Open_03:30:00 Wednesday_Open_04:00:00 Wednesday_Open_04:15:00 Wednesday_Open_04:30:00 Wednesday_Open_04:45:00 Wednesday_Open_05:00:00 Wednesday_Open_05:30:00 Wednesday_Open_05:45:00 Wednesday_Open_06:00:00 Wednesday_Open_06:30:00 Wednesday_Open_06:45:00 Wednesday_Open_07:00:00 Wednesday_Open_07:30:00 Wednesday_Open_07:45:00 Wednesday_Open_08:00:00 Wednesday_Open_08:15:00 Wednesday_Open_08:30:00 Wednesday_Open_09:00:00 Wednesday_Open_09:15:00 Wednesday_Open_09:30:00 Wednesday_Open_10:00:00 Wednesday_Open_10:15:00 Wednesday_Open_10:30:00 Wednesday_Open_10:45:00 Wednesday_Open_11:00:00 Wednesday_Open_11:15:00 Wednesday_Open_11:30:00 Wednesday_Open_11:45:00 Wednesday_Open_12:00:00 Wednesday_Open_12:15:00 Wednesday_Open_12:30:00 Wednesday_Open_13:00:00 Wednesday_Open_13:30:00 Wednesday_Open_14:00:00 Wednesday_Open_14:30:00 Wednesday_Open_15:00:00 Wednesday_Open_15:30:00 Wednesday_Open_16:00:00 Wednesday_Open_16:15:00 Wednesday_Open_16:30:00 Wednesday_Open_17:00:00 Wednesday_Open_17:15:00 Wednesday_Open_17:30:00 Wednesday_Open_18:00:00 Wednesday_Open_18:30:00 Wednesday_Open_19:00:00 Wednesday_Open_19:30:00 Wednesday_Open_20:00:00 Wednesday_Open_20:30:00 Wednesday_Open_21:00:00 Wednesday_Open_22:00:00 Wednesday_Open_22:30:00 Wednesday_Open_23:00:00 Wednesday_Open_NaT Thursday_Open_00:00:00 Thursday_Open_00:15:00 Thursday_Open_01:00:00 Thursday_Open_02:00:00 Thursday_Open_03:00:00 Thursday_Open_03:30:00 Thursday_Open_04:00:00 Thursday_Open_04:15:00 Thursday_Open_04:30:00 Thursday_Open_04:45:00 Thursday_Open_05:00:00 Thursday_Open_05:30:00 Thursday_Open_05:45:00 Thursday_Open_06:00:00 Thursday_Open_06:30:00 Thursday_Open_06:45:00 Thursday_Open_07:00:00 Thursday_Open_07:30:00 Thursday_Open_07:45:00 Thursday_Open_08:00:00 
Thursday_Open_08:15:00 Thursday_Open_08:30:00 Thursday_Open_09:00:00 Thursday_Open_09:15:00 Thursday_Open_09:30:00 Thursday_Open_10:00:00 Thursday_Open_10:15:00 Thursday_Open_10:30:00 Thursday_Open_10:45:00 Thursday_Open_11:00:00 Thursday_Open_11:15:00 Thursday_Open_11:30:00 Thursday_Open_11:45:00 Thursday_Open_12:00:00 Thursday_Open_12:15:00 Thursday_Open_12:30:00 Thursday_Open_13:00:00 Thursday_Open_13:30:00 Thursday_Open_14:00:00 Thursday_Open_14:30:00 Thursday_Open_15:00:00 Thursday_Open_15:30:00 Thursday_Open_16:00:00 Thursday_Open_16:15:00 Thursday_Open_16:30:00 Thursday_Open_17:00:00 Thursday_Open_17:15:00 Thursday_Open_17:30:00 Thursday_Open_17:45:00 Thursday_Open_18:00:00 Thursday_Open_18:30:00 Thursday_Open_19:00:00 Thursday_Open_19:30:00 Thursday_Open_20:00:00 Thursday_Open_20:30:00 Thursday_Open_21:00:00 Thursday_Open_21:30:00 Thursday_Open_22:00:00 Thursday_Open_22:30:00 Thursday_Open_23:00:00 Thursday_Open_23:30:00 Thursday_Open_NaT Friday_Open_00:00:00 Friday_Open_01:00:00 Friday_Open_03:00:00 Friday_Open_03:30:00 Friday_Open_04:00:00 Friday_Open_04:15:00 Friday_Open_04:30:00 Friday_Open_04:45:00 Friday_Open_05:00:00 Friday_Open_05:30:00 Friday_Open_05:45:00 Friday_Open_06:00:00 Friday_Open_06:30:00 Friday_Open_06:45:00 Friday_Open_07:00:00 Friday_Open_07:30:00 Friday_Open_07:45:00 Friday_Open_08:00:00 Friday_Open_08:15:00 Friday_Open_08:30:00 Friday_Open_09:00:00 Friday_Open_09:15:00 Friday_Open_09:30:00 Friday_Open_10:00:00 Friday_Open_10:15:00 Friday_Open_10:30:00 Friday_Open_10:45:00 Friday_Open_11:00:00 Friday_Open_11:15:00 Friday_Open_11:30:00 Friday_Open_11:45:00 Friday_Open_12:00:00 Friday_Open_12:15:00 Friday_Open_12:30:00 Friday_Open_13:00:00 Friday_Open_13:30:00 Friday_Open_14:00:00 Friday_Open_14:30:00 Friday_Open_15:00:00 Friday_Open_15:30:00 Friday_Open_16:00:00 Friday_Open_16:15:00 Friday_Open_16:30:00 Friday_Open_17:00:00 Friday_Open_17:15:00 Friday_Open_17:30:00 Friday_Open_18:00:00 Friday_Open_18:30:00 Friday_Open_19:00:00 
Friday_Open_19:30:00 Friday_Open_20:00:00 Friday_Open_20:30:00 Friday_Open_21:00:00 Friday_Open_22:00:00 Friday_Open_22:30:00 Friday_Open_23:00:00 Friday_Open_23:30:00 Friday_Open_NaT Saturday_Open_00:00:00 Saturday_Open_01:00:00 Saturday_Open_02:30:00 Saturday_Open_03:00:00 Saturday_Open_03:30:00 Saturday_Open_04:00:00 Saturday_Open_04:30:00 Saturday_Open_04:45:00 Saturday_Open_05:00:00 Saturday_Open_05:30:00 Saturday_Open_05:45:00 Saturday_Open_06:00:00 Saturday_Open_06:30:00 Saturday_Open_06:45:00 Saturday_Open_07:00:00 Saturday_Open_07:30:00 Saturday_Open_08:00:00 Saturday_Open_08:15:00 Saturday_Open_08:30:00 Saturday_Open_09:00:00 Saturday_Open_09:30:00 Saturday_Open_10:00:00 Saturday_Open_10:15:00 Saturday_Open_10:30:00 Saturday_Open_10:45:00 Saturday_Open_11:00:00 Saturday_Open_11:15:00 Saturday_Open_11:30:00 Saturday_Open_11:45:00 Saturday_Open_12:00:00 Saturday_Open_12:15:00 Saturday_Open_12:30:00 Saturday_Open_13:00:00 Saturday_Open_13:30:00 Saturday_Open_14:00:00 Saturday_Open_14:30:00 Saturday_Open_15:00:00 Saturday_Open_15:30:00 Saturday_Open_16:00:00 Saturday_Open_16:15:00 Saturday_Open_16:30:00 Saturday_Open_17:00:00 Saturday_Open_17:30:00 Saturday_Open_18:00:00 Saturday_Open_18:15:00 Saturday_Open_18:30:00 Saturday_Open_19:00:00 Saturday_Open_19:30:00 Saturday_Open_19:45:00 Saturday_Open_20:00:00 Saturday_Open_20:30:00 Saturday_Open_21:00:00 Saturday_Open_21:15:00 Saturday_Open_21:30:00 Saturday_Open_22:00:00 Saturday_Open_22:30:00 Saturday_Open_23:00:00 Saturday_Open_23:30:00 Saturday_Open_NaT Sunday_Open_00:00:00 Sunday_Open_00:30:00 Sunday_Open_01:00:00 Sunday_Open_02:00:00 Sunday_Open_02:30:00 Sunday_Open_03:00:00 Sunday_Open_03:30:00 Sunday_Open_04:00:00 Sunday_Open_04:30:00 Sunday_Open_04:45:00 Sunday_Open_05:00:00 Sunday_Open_05:30:00 Sunday_Open_05:45:00 Sunday_Open_06:00:00 Sunday_Open_06:30:00 Sunday_Open_06:45:00 Sunday_Open_07:00:00 Sunday_Open_07:30:00 Sunday_Open_07:45:00 Sunday_Open_08:00:00 Sunday_Open_08:15:00 Sunday_Open_08:30:00 
Sunday_Open_09:00:00 Sunday_Open_09:30:00 Sunday_Open_09:45:00 Sunday_Open_10:00:00 Sunday_Open_10:15:00 Sunday_Open_10:30:00 Sunday_Open_10:45:00 Sunday_Open_11:00:00 Sunday_Open_11:15:00 Sunday_Open_11:30:00 Sunday_Open_11:45:00 Sunday_Open_12:00:00 Sunday_Open_12:15:00 Sunday_Open_12:30:00 Sunday_Open_13:00:00 Sunday_Open_13:15:00 Sunday_Open_13:30:00 Sunday_Open_14:00:00 Sunday_Open_14:30:00 Sunday_Open_15:00:00 Sunday_Open_15:30:00 Sunday_Open_16:00:00 Sunday_Open_16:15:00 Sunday_Open_16:30:00 Sunday_Open_17:00:00 Sunday_Open_17:15:00 Sunday_Open_17:30:00 Sunday_Open_18:00:00 Sunday_Open_18:30:00 Sunday_Open_19:00:00 Sunday_Open_19:30:00 Sunday_Open_20:00:00 Sunday_Open_20:30:00 Sunday_Open_21:00:00 Sunday_Open_21:30:00 Sunday_Open_22:00:00 Sunday_Open_22:30:00 Sunday_Open_23:00:00 Sunday_Open_23:30:00 Sunday_Open_NaT Monday_Close_00:00:00 Monday_Close_00:15:00 Monday_Close_00:30:00 Monday_Close_01:00:00 Monday_Close_01:10:00 Monday_Close_01:15:00 Monday_Close_01:30:00 Monday_Close_01:45:00 Monday_Close_02:00:00 Monday_Close_02:30:00 Monday_Close_02:45:00 Monday_Close_03:00:00 Monday_Close_03:30:00 Monday_Close_03:59:00 Monday_Close_04:00:00 Monday_Close_04:30:00 Monday_Close_04:45:00 Monday_Close_05:00:00 Monday_Close_05:30:00 Monday_Close_06:00:00 Monday_Close_06:30:00 Monday_Close_07:00:00 Monday_Close_08:00:00 Monday_Close_08:30:00 Monday_Close_09:00:00 Monday_Close_09:30:00 Monday_Close_10:00:00 Monday_Close_10:15:00 Monday_Close_11:00:00 Monday_Close_11:15:00 Monday_Close_11:30:00 Monday_Close_12:00:00 Monday_Close_12:30:00 Monday_Close_13:00:00 Monday_Close_13:30:00 Monday_Close_14:00:00 Monday_Close_14:30:00 Monday_Close_14:45:00 Monday_Close_15:00:00 Monday_Close_15:30:00 Monday_Close_16:00:00 Monday_Close_16:30:00 Monday_Close_16:45:00 Monday_Close_17:00:00 Monday_Close_17:15:00 Monday_Close_17:30:00 Monday_Close_18:00:00 Monday_Close_18:30:00 Monday_Close_19:00:00 Monday_Close_19:30:00 Monday_Close_19:45:00 Monday_Close_20:00:00 
Monday_Close_20:30:00 Monday_Close_20:35:00 Monday_Close_20:45:00 Monday_Close_21:00:00 Monday_Close_21:15:00 Monday_Close_21:30:00 Monday_Close_21:45:00 Monday_Close_22:00:00 Monday_Close_22:15:00 Monday_Close_22:30:00 Monday_Close_22:45:00 Monday_Close_23:00:00 Monday_Close_23:15:00 Monday_Close_23:30:00 Monday_Close_23:45:00 Monday_Close_23:59:00 Monday_Close_NaT Tuesday_Close_00:00:00 Tuesday_Close_00:15:00 Tuesday_Close_00:30:00 Tuesday_Close_00:45:00 Tuesday_Close_01:00:00 Tuesday_Close_01:10:00 Tuesday_Close_01:15:00 Tuesday_Close_01:30:00 Tuesday_Close_01:45:00 Tuesday_Close_02:00:00 Tuesday_Close_02:30:00 Tuesday_Close_03:00:00 Tuesday_Close_03:30:00 Tuesday_Close_03:59:00 Tuesday_Close_04:00:00 Tuesday_Close_04:30:00 Tuesday_Close_04:45:00 Tuesday_Close_05:00:00 Tuesday_Close_05:30:00 Tuesday_Close_06:00:00 Tuesday_Close_07:00:00 Tuesday_Close_08:00:00 Tuesday_Close_08:30:00 Tuesday_Close_09:00:00 Tuesday_Close_09:30:00 Tuesday_Close_10:00:00 Tuesday_Close_10:15:00 Tuesday_Close_10:30:00 Tuesday_Close_11:00:00 Tuesday_Close_11:30:00 Tuesday_Close_12:00:00 Tuesday_Close_12:30:00 Tuesday_Close_12:45:00 Tuesday_Close_13:00:00 Tuesday_Close_13:30:00 Tuesday_Close_14:00:00 Tuesday_Close_14:30:00 Tuesday_Close_14:45:00 Tuesday_Close_15:00:00 Tuesday_Close_15:30:00 Tuesday_Close_15:45:00 Tuesday_Close_16:00:00 Tuesday_Close_16:30:00 Tuesday_Close_16:45:00 Tuesday_Close_17:00:00 Tuesday_Close_17:15:00 Tuesday_Close_17:30:00 Tuesday_Close_18:00:00 Tuesday_Close_18:30:00 Tuesday_Close_19:00:00 Tuesday_Close_19:30:00 Tuesday_Close_19:45:00 Tuesday_Close_20:00:00 Tuesday_Close_20:15:00 Tuesday_Close_20:30:00 Tuesday_Close_20:35:00 Tuesday_Close_20:45:00 Tuesday_Close_21:00:00 Tuesday_Close_21:15:00 Tuesday_Close_21:30:00 Tuesday_Close_21:45:00 Tuesday_Close_22:00:00 Tuesday_Close_22:15:00 Tuesday_Close_22:30:00 Tuesday_Close_22:45:00 Tuesday_Close_23:00:00 Tuesday_Close_23:30:00 Tuesday_Close_23:45:00 Tuesday_Close_23:59:00 Tuesday_Close_NaT Wednesday_Close_00:00:00 
Wednesday_Close_00:15:00 Wednesday_Close_00:30:00 Wednesday_Close_00:45:00 Wednesday_Close_01:00:00 Wednesday_Close_01:10:00 Wednesday_Close_01:15:00 Wednesday_Close_01:30:00 Wednesday_Close_01:45:00 Wednesday_Close_02:00:00 Wednesday_Close_02:30:00 Wednesday_Close_02:45:00 Wednesday_Close_03:00:00 Wednesday_Close_03:30:00 Wednesday_Close_03:59:00 Wednesday_Close_04:00:00 Wednesday_Close_04:30:00 Wednesday_Close_04:45:00 Wednesday_Close_05:00:00 Wednesday_Close_05:30:00 Wednesday_Close_06:00:00 Wednesday_Close_07:00:00 Wednesday_Close_07:45:00 Wednesday_Close_08:00:00 Wednesday_Close_08:30:00 Wednesday_Close_09:00:00 Wednesday_Close_10:00:00 Wednesday_Close_10:30:00 Wednesday_Close_11:00:00 Wednesday_Close_11:15:00 Wednesday_Close_11:30:00 Wednesday_Close_12:00:00 Wednesday_Close_12:30:00 Wednesday_Close_12:45:00 Wednesday_Close_13:00:00 Wednesday_Close_13:30:00 Wednesday_Close_14:00:00 Wednesday_Close_14:30:00 Wednesday_Close_14:45:00 Wednesday_Close_15:00:00 Wednesday_Close_15:30:00 Wednesday_Close_16:00:00 Wednesday_Close_16:30:00 Wednesday_Close_17:00:00 Wednesday_Close_17:30:00 Wednesday_Close_18:00:00 Wednesday_Close_18:30:00 Wednesday_Close_18:45:00 Wednesday_Close_19:00:00 Wednesday_Close_19:30:00 Wednesday_Close_19:45:00 Wednesday_Close_20:00:00 Wednesday_Close_20:15:00 Wednesday_Close_20:30:00 Wednesday_Close_20:35:00 Wednesday_Close_20:45:00 Wednesday_Close_21:00:00 Wednesday_Close_21:15:00 Wednesday_Close_21:30:00 Wednesday_Close_21:45:00 Wednesday_Close_22:00:00 Wednesday_Close_22:15:00 Wednesday_Close_22:30:00 Wednesday_Close_22:45:00 Wednesday_Close_23:00:00 Wednesday_Close_23:15:00 Wednesday_Close_23:30:00 Wednesday_Close_23:45:00 Wednesday_Close_23:59:00 Wednesday_Close_NaT Thursday_Close_00:00:00 Thursday_Close_00:15:00 Thursday_Close_00:30:00 Thursday_Close_00:45:00 Thursday_Close_01:00:00 Thursday_Close_01:10:00 Thursday_Close_01:15:00 Thursday_Close_01:30:00 Thursday_Close_01:45:00 Thursday_Close_02:00:00 Thursday_Close_02:30:00 
Thursday_Close_02:45:00 Thursday_Close_03:00:00 Thursday_Close_03:30:00 Thursday_Close_03:59:00 Thursday_Close_04:00:00 Thursday_Close_04:15:00 Thursday_Close_04:30:00 Thursday_Close_04:45:00 Thursday_Close_05:00:00 Thursday_Close_05:30:00 Thursday_Close_06:00:00 Thursday_Close_07:00:00 Thursday_Close_07:45:00 Thursday_Close_08:00:00 Thursday_Close_08:30:00 Thursday_Close_09:00:00 Thursday_Close_09:30:00 Thursday_Close_10:00:00 Thursday_Close_10:30:00 Thursday_Close_11:00:00 Thursday_Close_11:15:00 Thursday_Close_11:30:00 Thursday_Close_11:45:00 Thursday_Close_12:00:00 Thursday_Close_12:30:00 Thursday_Close_12:45:00 Thursday_Close_13:00:00 Thursday_Close_13:30:00 Thursday_Close_14:00:00 Thursday_Close_14:30:00 Thursday_Close_14:45:00 Thursday_Close_15:00:00 Thursday_Close_15:15:00 Thursday_Close_15:30:00 Thursday_Close_16:00:00 Thursday_Close_16:30:00 Thursday_Close_17:00:00 Thursday_Close_17:15:00 Thursday_Close_17:30:00 Thursday_Close_17:45:00 Thursday_Close_18:00:00 Thursday_Close_18:15:00 Thursday_Close_18:30:00 Thursday_Close_18:45:00 Thursday_Close_19:00:00 Thursday_Close_19:30:00 Thursday_Close_19:45:00 Thursday_Close_20:00:00 Thursday_Close_20:15:00 Thursday_Close_20:30:00 Thursday_Close_20:45:00 Thursday_Close_21:00:00 Thursday_Close_21:15:00 Thursday_Close_21:30:00 Thursday_Close_21:35:00 Thursday_Close_21:45:00 Thursday_Close_22:00:00 Thursday_Close_22:15:00 Thursday_Close_22:30:00 Thursday_Close_22:45:00 Thursday_Close_23:00:00 Thursday_Close_23:15:00 Thursday_Close_23:30:00 Thursday_Close_23:45:00 Thursday_Close_23:59:00 Thursday_Close_NaT Friday_Close_00:00:00 Friday_Close_00:15:00 Friday_Close_00:30:00 Friday_Close_00:45:00 Friday_Close_01:00:00 Friday_Close_01:10:00 Friday_Close_01:20:00 Friday_Close_01:30:00 Friday_Close_01:45:00 Friday_Close_02:00:00 Friday_Close_02:15:00 Friday_Close_02:30:00 Friday_Close_02:45:00 Friday_Close_03:00:00 Friday_Close_03:30:00 Friday_Close_03:45:00 Friday_Close_03:59:00 Friday_Close_04:00:00 Friday_Close_04:15:00 
Friday_Close_04:30:00 Friday_Close_04:45:00 Friday_Close_05:00:00 Friday_Close_05:30:00 Friday_Close_06:00:00 Friday_Close_06:30:00 Friday_Close_07:00:00 Friday_Close_07:45:00 Friday_Close_08:00:00 Friday_Close_08:30:00 Friday_Close_09:00:00 Friday_Close_10:00:00 Friday_Close_10:15:00 Friday_Close_10:30:00 Friday_Close_11:00:00 Friday_Close_11:15:00 Friday_Close_11:30:00 Friday_Close_12:00:00 Friday_Close_12:15:00 Friday_Close_12:30:00 Friday_Close_12:45:00 Friday_Close_13:00:00 Friday_Close_13:30:00 Friday_Close_14:00:00 Friday_Close_14:30:00 Friday_Close_14:45:00 Friday_Close_15:00:00 Friday_Close_15:30:00 Friday_Close_16:00:00 Friday_Close_16:30:00 Friday_Close_16:45:00 Friday_Close_17:00:00 Friday_Close_17:15:00 Friday_Close_17:30:00 Friday_Close_18:00:00 Friday_Close_18:30:00 Friday_Close_19:00:00 Friday_Close_19:30:00 Friday_Close_19:45:00 Friday_Close_20:00:00 Friday_Close_20:15:00 Friday_Close_20:30:00 Friday_Close_20:45:00 Friday_Close_21:00:00 Friday_Close_21:15:00 Friday_Close_21:30:00 Friday_Close_21:35:00 Friday_Close_21:45:00 Friday_Close_22:00:00 Friday_Close_22:15:00 Friday_Close_22:30:00 Friday_Close_22:45:00 Friday_Close_23:00:00 Friday_Close_23:15:00 Friday_Close_23:30:00 Friday_Close_23:45:00 Friday_Close_23:59:00 Friday_Close_NaT Saturday_Close_00:00:00 Saturday_Close_00:15:00 Saturday_Close_00:30:00 Saturday_Close_00:45:00 Saturday_Close_01:00:00 Saturday_Close_01:10:00 Saturday_Close_01:20:00 Saturday_Close_01:30:00 Saturday_Close_01:45:00 Saturday_Close_02:00:00 Saturday_Close_02:15:00 Saturday_Close_02:30:00 Saturday_Close_02:45:00 Saturday_Close_03:00:00 Saturday_Close_03:30:00 Saturday_Close_03:45:00 Saturday_Close_03:59:00 Saturday_Close_04:00:00 Saturday_Close_04:15:00 Saturday_Close_04:30:00 Saturday_Close_04:45:00 Saturday_Close_05:00:00 Saturday_Close_05:30:00 Saturday_Close_06:00:00 Saturday_Close_06:30:00 Saturday_Close_07:00:00 Saturday_Close_08:00:00 Saturday_Close_08:30:00 Saturday_Close_09:00:00 Saturday_Close_10:00:00 
Saturday_Close_10:30:00 Saturday_Close_11:00:00 Saturday_Close_11:15:00 Saturday_Close_11:30:00 Saturday_Close_11:45:00 Saturday_Close_12:00:00 Saturday_Close_12:30:00 Saturday_Close_12:45:00 Saturday_Close_13:00:00 Saturday_Close_13:30:00 Saturday_Close_14:00:00 Saturday_Close_14:30:00 Saturday_Close_14:45:00 Saturday_Close_15:00:00 Saturday_Close_15:30:00 Saturday_Close_15:45:00 Saturday_Close_16:00:00 Saturday_Close_16:30:00 Saturday_Close_16:45:00 Saturday_Close_17:00:00 Saturday_Close_17:30:00 Saturday_Close_17:45:00 Saturday_Close_18:00:00 Saturday_Close_18:30:00 Saturday_Close_19:00:00 Saturday_Close_19:15:00 Saturday_Close_19:30:00 Saturday_Close_19:45:00 Saturday_Close_20:00:00 Saturday_Close_20:30:00 Saturday_Close_20:45:00 Saturday_Close_21:00:00 Saturday_Close_21:15:00 Saturday_Close_21:30:00 Saturday_Close_21:35:00 Saturday_Close_21:45:00 Saturday_Close_22:00:00 Saturday_Close_22:15:00 Saturday_Close_22:30:00 Saturday_Close_22:45:00 Saturday_Close_23:00:00 Saturday_Close_23:15:00 Saturday_Close_23:30:00 Saturday_Close_23:45:00 Saturday_Close_23:59:00 Saturday_Close_NaT Sunday_Close_00:00:00 Sunday_Close_00:15:00 Sunday_Close_00:30:00 Sunday_Close_01:00:00 Sunday_Close_01:10:00 Sunday_Close_01:15:00 Sunday_Close_01:30:00 Sunday_Close_01:45:00 Sunday_Close_02:00:00 Sunday_Close_02:15:00 Sunday_Close_02:30:00 Sunday_Close_03:00:00 Sunday_Close_03:30:00 Sunday_Close_03:59:00 Sunday_Close_04:00:00 Sunday_Close_04:30:00 Sunday_Close_04:45:00 Sunday_Close_05:00:00 Sunday_Close_05:30:00 Sunday_Close_06:00:00 Sunday_Close_07:00:00 Sunday_Close_08:00:00 Sunday_Close_08:30:00 Sunday_Close_09:00:00 Sunday_Close_09:30:00 Sunday_Close_10:00:00 Sunday_Close_10:30:00 Sunday_Close_11:00:00 Sunday_Close_11:15:00 Sunday_Close_11:30:00 Sunday_Close_12:00:00 Sunday_Close_12:15:00 Sunday_Close_12:30:00 Sunday_Close_12:45:00 Sunday_Close_13:00:00 Sunday_Close_13:30:00 Sunday_Close_14:00:00 Sunday_Close_14:30:00 Sunday_Close_14:45:00 Sunday_Close_15:00:00 
Sunday_Close_15:30:00 Sunday_Close_15:45:00 Sunday_Close_16:00:00 Sunday_Close_16:30:00 Sunday_Close_16:45:00 Sunday_Close_17:00:00 Sunday_Close_17:30:00 Sunday_Close_18:00:00 Sunday_Close_18:15:00 Sunday_Close_18:30:00 Sunday_Close_19:00:00 Sunday_Close_19:30:00 Sunday_Close_19:45:00 Sunday_Close_20:00:00 Sunday_Close_20:30:00 Sunday_Close_20:35:00 Sunday_Close_20:45:00 Sunday_Close_21:00:00 Sunday_Close_21:15:00 Sunday_Close_21:30:00 Sunday_Close_21:45:00 Sunday_Close_22:00:00 Sunday_Close_22:15:00 Sunday_Close_22:30:00 Sunday_Close_22:45:00 Sunday_Close_23:00:00 Sunday_Close_23:15:00 Sunday_Close_23:30:00 Sunday_Close_23:45:00 Sunday_Close_23:59:00 Sunday_Close_NaT categories_ & Probates categories_ Acai Bowls categories_ Accessories categories_ Accountants categories_ Active Life categories_ Acupuncture categories_ Adult Education categories_ Adult Entertainment categories_ Advertising categories_ Afghan categories_ African categories_ Air Duct Cleaning categories_ Aircraft Repairs categories_ Airport Lounges categories_ Airport Shuttles categories_ Airport Terminals categories_ Airports categories_ Airsoft categories_ Amateur Sports Teams categories_ American (New) categories_ American (Traditional) categories_ Amusement Parks categories_ Animal Assisted Therapy categories_ Animal Physical Therapy categories_ Animal Shelters categories_ Antiques categories_ Apartments categories_ Appliances categories_ Appliances & Repair categories_ Aquarium Services categories_ Arabian categories_ Arcades categories_ Argentine categories_ Armenian categories_ Art Classes categories_ Art Galleries categories_ Art Museums categories_ Art Schools categories_ Art Supplies categories_ Arts & Crafts categories_ Arts & Entertainment categories_ Asian Fusion categories_ Audio/Visual Equipment Rental categories_ Australian categories_ Austrian categories_ Auto Customization categories_ Auto Detailing categories_ Auto Glass Services categories_ Auto Insurance categories_ Auto Parts & 
Supplies categories_ Auto Repair categories_ Auto Upholstery categories_ Automotive categories_ Baby Gear & Furniture categories_ Bagels categories_ Bakeries categories_ Bangladeshi categories_ Bankruptcy Law categories_ Bar Crawl categories_ Barbeque categories_ Barbers categories_ Bars categories_ Bartenders categories_ Basque categories_ Batting Cages categories_ Beach Bars categories_ Beaches categories_ Beauty & Spas categories_ Bed & Breakfast categories_ Beer categories_ Beer Bar categories_ Beer Garden categories_ Beer Gardens categories_ Beer Hall categories_ Belgian categories_ Bespoke Clothing categories_ Beverage Store categories_ Bike Rentals categories_ Bike Repair/Maintenance categories_ Bikes categories_ Bingo Halls categories_ Bistros categories_ Blow Dry/Out Services categories_ Boat Charters categories_ Boat Dealers categories_ Boat Repair categories_ Boating categories_ Bocce Ball categories_ Body Shops categories_ Books categories_ Bookstores categories_ Botanical Gardens categories_ Bounce House Rentals categories_ Bowling categories_ Brasseries categories_ Brazilian categories_ Brazilian Jiu-jitsu categories_ Breakfast & Brunch categories_ Breweries categories_ Brewpubs categories_ Bridal categories_ British categories_ Bubble Tea categories_ Buffets categories_ Building Supplies categories_ Burgers categories_ Burmese categories_ Bus Tours categories_ Business Consulting categories_ Butcher categories_ CSA categories_ Cabaret categories_ Cafes categories_ Cafeteria categories_ Cajun/Creole categories_ Cambodian categories_ Campgrounds categories_ Canadian (New) categories_ Candy Stores categories_ Cannabis Clinics categories_ Cannabis Collective categories_ Cannabis Dispensaries categories_ Cantonese categories_ Car Dealers categories_ Car Rental categories_ Car Share Services categories_ Car Wash categories_ Car Window Tinting categories_ Cardiologists categories_ Cards & Stationery categories_ Caribbean categories_ Carpet Installation 
categories_ Casinos categories_ Caterers categories_ Champagne Bars categories_ Check Cashing/Pay-day Loans categories_ Cheese Shops categories_ Cheese Tasting Classes categories_ Cheesesteaks categories_ Chicken Shop categories_ Chicken Wings categories_ Chinese categories_ Chiropractors categories_ Chocolatiers & Shops categories_ Christmas Trees categories_ Churches categories_ Churros categories_ Cideries categories_ Cigar Bars categories_ Cinema categories_ Climbing categories_ Clothing Rental categories_ Clowns categories_ Club Crawl categories_ Cocktail Bars categories_ Coffee & Tea categories_ Coffee & Tea Supplies categories_ Coffee Roasteries categories_ Coffeeshops categories_ Colleges & Universities categories_ Colombian categories_ Comedy Clubs categories_ Comfort Food categories_ Comic Books categories_ Commercial Truck Repair categories_ Community Centers categories_ Community Service/Non-Profit categories_ Computers categories_ Contractors categories_ Convenience Stores categories_ Conveyor Belt Sushi categories_ Cooking Classes categories_ Cooking Schools categories_ Cosmetic Dentists categories_ Cosmetics & Beauty Supply categories_ Counseling & Mental Health categories_ Country Clubs categories_ Country Dance Halls categories_ Couriers & Delivery Services categories_ Creperies categories_ Cuban categories_ Cultural Center categories_ Cupcakes categories_ Currency Exchange categories_ Custom Cakes categories_ Czech categories_ DJs categories_ Dance Clubs categories_ Dance Schools categories_ Day Camps categories_ Day Spas categories_ Delicatessen categories_ Delis categories_ Dentists categories_ Department Stores categories_ Desserts categories_ Diagnostic Services categories_ Dim Sum categories_ Diners categories_ Dinner Theater categories_ Discount Store categories_ Distilleries categories_ Dive Bars categories_ Divorce & Family Law categories_ Do-It-Yourself Food categories_ Doctors categories_ Dog Walkers categories_ Dominican categories_ 
Donairs categories_ Donuts categories_ Door Sales/Installation categories_ Drive-Thru Bars categories_ Drugstores categories_ Dry Cleaning categories_ Dry Cleaning & Laundry categories_ Drywall Installation & Repair categories_ Eatertainment categories_ Education categories_ Egyptian categories_ Electronics categories_ Emergency Medicine categories_ Empanadas categories_ Employment Agencies categories_ Engraving categories_ Escape Games categories_ Estate Planning Law categories_ Ethical Grocery categories_ Ethiopian categories_ Ethnic Food categories_ Ethnic Grocery categories_ Event Planning & Services categories_ Eyebrow Services categories_ Eyelash Service categories_ Eyewear & Opticians categories_ Falafel categories_ Farmers Market categories_ Farms categories_ Fashion categories_ Fast Food categories_ Festivals categories_ Filipino categories_ Financial Services categories_ Fireplace Services categories_ Fish & Chips categories_ Fitness & Instruction categories_ Flea Markets categories_ Flooring categories_ Florists categories_ Flowers & Gifts categories_ Fondue categories_ Food categories_ Food Banks categories_ Food Court categories_ Food Delivery Services categories_ Food Stands categories_ Food Tours categories_ Food Trucks categories_ Formal Wear categories_ Foundation Repair categories_ French categories_ Fruits & Veggies categories_ Fur Clothing categories_ Furniture Rental categories_ Furniture Repair categories_ Furniture Reupholstery categories_ Furniture Stores categories_ Game Meat categories_ Gas Stations categories_ Gastropubs categories_ Gay Bars categories_ Gelato categories_ General Dentistry categories_ German categories_ Gift Shops categories_ Glass & Mirrors categories_ Gluten-Free categories_ Golf categories_ Golf Cart Dealers categories_ Golf Lessons categories_ Graphic Design categories_ Greek categories_ Grilling Equipment categories_ Grocery categories_ Guamanian categories_ Guest Houses categories_ Gun/Rifle Ranges categories_ 
Gutter Services categories_ Gyms categories_ Hainan categories_ Hair Extensions categories_ Hair Removal categories_ Hair Salons categories_ Hair Stylists categories_ Haitian categories_ Hakka categories_ Halal categories_ Handyman categories_ Hardware Stores categories_ Hats categories_ Hawaiian categories_ Head Shops categories_ Health & Medical categories_ Health Markets categories_ Health Retreats categories_ Heating & Air Conditioning/HVAC categories_ Herbs & Spices categories_ Himalayan/Nepalese categories_ Historical Tours categories_ Hobby Shops categories_ Holiday Decorations categories_ Holistic Animal Care categories_ Home & Garden categories_ Home Cleaning categories_ Home Decor categories_ Home Health Care categories_ Home Services categories_ Home Window Tinting categories_ Honduran categories_ Honey categories_ Hong Kong Style Cafe categories_ Hookah Bars categories_ Horse Racing categories_ Horseback Riding categories_ Hospitals categories_ Hostels categories_ Hot Dogs categories_ Hot Pot categories_ Hot Tub & Pool categories_ Hotel bar categories_ Hotels categories_ Hotels & Travel categories_ Hungarian categories_ Iberian categories_ Ice Cream & Frozen Yogurt categories_ Ice Delivery categories_ Immigration Law categories_ Imported Food categories_ Indian categories_ Indonesian categories_ Indoor Playcentre categories_ Insurance categories_ Interior Design categories_ International categories_ International Grocery categories_ Internet Cafes categories_ Irish categories_ Irish Pub categories_ Italian categories_ Izakaya categories_ Japanese categories_ Japanese Curry categories_ Jazz & Blues categories_ Jewelry categories_ Juice Bars & Smoothies categories_ Karaoke categories_ Kebab categories_ Kids Activities categories_ Kids Hair Salons categories_ Kitchen & Bath categories_ Knife Sharpening categories_ Kombucha categories_ Korean categories_ Kosher categories_ Laboratory Testing categories_ Lakes categories_ Landmarks & Historical Buildings 
[Output truncated: the merged dataset contains several hundred one-hot `categories_*` indicator columns (e.g. `categories_Restaurants`, `categories_Pizza`, `categories_Wine Bars`), followed by sample rows that combine review features (`review_id`, `user_id`, `business_id`, stars, useful/funny/cool votes), business features (latitude, longitude, review counts, average ratings), user aggregates, and the binary category flags. Note that many categories appear twice, once with and once without a leading space (`categories_ Pizza` vs `categories_Pizza`), which indicates the comma-separated category string was split without stripping whitespace.]
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 3267.0 43.841694 -79.399755 44 3.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.000000 2.980769 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.00000 3.686094 2.000000 3.040000 2.600000 3.600000 2.500000 1.000000 3.750000 2.923077 4.000000 3.662669 1.666667 3.146341 2.60000 3.750000 2.500000 1.000000 3.666667 2.954478 4.000000 3.678871 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
5 rPNt-m2pdt-OR_NMQnHjCQ tKIihU81IA3NjpsADuR-Tg --6MefnULPED_I942VcFNA 5 3 0 2 1 0.907921 4.400000 4.400000 4.395327 4.376569 4.405801 4.313458 1 3267.0 43.841694 -79.399755 44 3.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 4.111111 230 2 12 166 70 32 2326 9.0 4.111111 9.0 4.132161 8.602213 4.400000 3.79608 3.686094 3.777956 4.500000 3.684951 3.789846 1.000000 3.868171 5.000000 4.400000 3.763461 3.662669 3.749776 4.500000 3.66752 3.771654 1.000000 3.851784 5.000000 4.395327 3.789966 3.678871 3.770170 4.481197 3.676024 3.788204 1.000000 3.867280 5.000000 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
6 Kg582pH05mZO_E6WS8PrKA XNOs3Wz1Q_zdRgm1Hy05fg --6MefnULPED_I942VcFNA 1 2 2 1 -1 0.184327 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 3267.0 43.841694 -79.399755 44 3.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 3.703313 1 0 0 2 1 2 1462 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
7 gkW6_UqV9b2XI_5ae8rBCg HSHuSCJvIvf_Tof62uZPEw --6MefnULPED_I942VcFNA 2 2 1 0 -1 0.687126 1.800000 1.692308 1.768813 1.816353 1.733574 1.724494 0 3267.0 43.841694 -79.399755 44 3.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 2.543478 87 1 5 96 34 65 2158 46.0 2.581395 43.0 2.580489 38.588153 1.800000 3.00000 3.686094 3.166667 2.875000 2.875000 3.000000 3.933014 1.000000 2.000000 1.692308 3.000000 3.662669 3.166667 3.142857 2.87500 3.000000 3.904608 1.000000 2.000000 1.768813 3.166279 3.678871 3.153689 3.005042 2.823432 2.991269 3.928912 1.000000 2.006375 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
8 02voOwsYf0cEdKNzt5IkwA yvpX68yurPsope6KhBZrYA --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 1 0.935335 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 23 0 0 23 8 2 286 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
[output truncated: the remaining displayed rows consist almost entirely of the one-hot encoded attribute, city, and opening-hours columns, which are nearly all zeros]
In [26]:
train_set.shape
Out[26]:
(558386, 2774)
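The large column count (2,774) comes mostly from one-hot encoding the categorical features (city, attributes, opening hours): each distinct category becomes its own 0/1 column. A minimal sketch of this expansion with `pd.get_dummies`, using hypothetical toy data rather than the real Yelp frame:

```python
import pandas as pd

# Hypothetical toy frame with two categorical columns, mimicking the
# real dataset's 'city' and 'WiFi' features.
toy = pd.DataFrame({
    "city": ["Toronto", "Calgary", "Toronto"],
    "WiFi": ["Free", "No", "Paid"],
})

# get_dummies expands every distinct category into its own 0/1 column,
# which is why the full feature matrix grows to thousands of columns.
dummies = pd.get_dummies(toy, columns=["city", "WiFi"])
print(sorted(dummies.columns))
# columns: city_Calgary, city_Toronto, WiFi_Free, WiFi_No, WiFi_Paid
```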
In [27]:
train_set.to_pickle('../dataset/m2_n9/model_train_set_2.pickle')
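Persisting the train set with `to_pickle` preserves dtypes exactly, so it can be reloaded later with `pd.read_pickle` without re-running the feature engineering. A minimal round-trip sketch (with a hypothetical temporary path, not the project's dataset directory):

```python
import os
import tempfile

import pandas as pd

# Toy frame standing in for the real train_set.
df = pd.DataFrame({"a": [1, 2], "b": [0.5, 1.5]})

# Round-trip through a pickle file: dtypes and values survive intact.
path = os.path.join(tempfile.mkdtemp(), "train.pickle")
df.to_pickle(path)
reloaded = pd.read_pickle(path)
assert reloaded.equals(df)
```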
In [28]:
test_set.head(10)
Out[28]:
[output truncated: first 10 rows of test_set. The 2,774 columns comprise review identifiers and vote counts (review_id, user_id, business_id, stars_review, useful_review, funny_review, cool_review), the engineered target and history scores (bin_truth_score, real_truth_score, cuisine_av_hist*, coll_score*), business metadata (postal_code, latitude, longitude, review_count, stars_restaurant), per-user statistics (fans, friends, years_of_elite, average_stars_user, …), per-cuisine average ratings (av_rat_*_cuisine and their _bin/_real variants), plus one-hot encodings of business attributes (OutdoorSeating_*, BusinessAcceptsCreditCards_*, RestaurantsDelivery_*, RestaurantsReservations_*, WiFi_*, Alcohol_*), of city (city_*), and of per-weekday opening and closing hours (e.g. Monday_Open_08:00:00, Monday_Close_22:00:00, …, including NaT columns for missing hours)]
Saturday_Close_10:30:00 Saturday_Close_11:00:00 Saturday_Close_11:15:00 Saturday_Close_11:30:00 Saturday_Close_11:45:00 Saturday_Close_12:00:00 Saturday_Close_12:30:00 Saturday_Close_12:45:00 Saturday_Close_13:00:00 Saturday_Close_13:30:00 Saturday_Close_14:00:00 Saturday_Close_14:30:00 Saturday_Close_14:45:00 Saturday_Close_15:00:00 Saturday_Close_15:30:00 Saturday_Close_15:45:00 Saturday_Close_16:00:00 Saturday_Close_16:30:00 Saturday_Close_16:45:00 Saturday_Close_17:00:00 Saturday_Close_17:30:00 Saturday_Close_17:45:00 Saturday_Close_18:00:00 Saturday_Close_18:30:00 Saturday_Close_19:00:00 Saturday_Close_19:15:00 Saturday_Close_19:30:00 Saturday_Close_19:45:00 Saturday_Close_20:00:00 Saturday_Close_20:30:00 Saturday_Close_20:45:00 Saturday_Close_21:00:00 Saturday_Close_21:15:00 Saturday_Close_21:30:00 Saturday_Close_21:35:00 Saturday_Close_21:45:00 Saturday_Close_22:00:00 Saturday_Close_22:15:00 Saturday_Close_22:30:00 Saturday_Close_22:45:00 Saturday_Close_23:00:00 Saturday_Close_23:15:00 Saturday_Close_23:30:00 Saturday_Close_23:45:00 Saturday_Close_23:59:00 Saturday_Close_NaT Sunday_Close_00:00:00 Sunday_Close_00:15:00 Sunday_Close_00:30:00 Sunday_Close_01:00:00 Sunday_Close_01:10:00 Sunday_Close_01:15:00 Sunday_Close_01:30:00 Sunday_Close_01:45:00 Sunday_Close_02:00:00 Sunday_Close_02:15:00 Sunday_Close_02:30:00 Sunday_Close_03:00:00 Sunday_Close_03:30:00 Sunday_Close_03:59:00 Sunday_Close_04:00:00 Sunday_Close_04:30:00 Sunday_Close_04:45:00 Sunday_Close_05:00:00 Sunday_Close_05:30:00 Sunday_Close_06:00:00 Sunday_Close_07:00:00 Sunday_Close_08:00:00 Sunday_Close_08:30:00 Sunday_Close_09:00:00 Sunday_Close_09:30:00 Sunday_Close_10:00:00 Sunday_Close_10:30:00 Sunday_Close_11:00:00 Sunday_Close_11:15:00 Sunday_Close_11:30:00 Sunday_Close_12:00:00 Sunday_Close_12:15:00 Sunday_Close_12:30:00 Sunday_Close_12:45:00 Sunday_Close_13:00:00 Sunday_Close_13:30:00 Sunday_Close_14:00:00 Sunday_Close_14:30:00 Sunday_Close_14:45:00 Sunday_Close_15:00:00 
Sunday_Close_15:30:00 Sunday_Close_15:45:00 Sunday_Close_16:00:00 Sunday_Close_16:30:00 Sunday_Close_16:45:00 Sunday_Close_17:00:00 Sunday_Close_17:30:00 Sunday_Close_18:00:00 Sunday_Close_18:15:00 Sunday_Close_18:30:00 Sunday_Close_19:00:00 Sunday_Close_19:30:00 Sunday_Close_19:45:00 Sunday_Close_20:00:00 Sunday_Close_20:30:00 Sunday_Close_20:35:00 Sunday_Close_20:45:00 Sunday_Close_21:00:00 Sunday_Close_21:15:00 Sunday_Close_21:30:00 Sunday_Close_21:45:00 Sunday_Close_22:00:00 Sunday_Close_22:15:00 Sunday_Close_22:30:00 Sunday_Close_22:45:00 Sunday_Close_23:00:00 Sunday_Close_23:15:00 Sunday_Close_23:30:00 Sunday_Close_23:45:00 Sunday_Close_23:59:00 Sunday_Close_NaT categories_ & Probates categories_ Acai Bowls categories_ Accessories categories_ Accountants categories_ Active Life categories_ Acupuncture categories_ Adult Education categories_ Adult Entertainment categories_ Advertising categories_ Afghan categories_ African categories_ Air Duct Cleaning categories_ Aircraft Repairs categories_ Airport Lounges categories_ Airport Shuttles categories_ Airport Terminals categories_ Airports categories_ Airsoft categories_ Amateur Sports Teams categories_ American (New) categories_ American (Traditional) categories_ Amusement Parks categories_ Animal Assisted Therapy categories_ Animal Physical Therapy categories_ Animal Shelters categories_ Antiques categories_ Apartments categories_ Appliances categories_ Appliances & Repair categories_ Aquarium Services categories_ Arabian categories_ Arcades categories_ Argentine categories_ Armenian categories_ Art Classes categories_ Art Galleries categories_ Art Museums categories_ Art Schools categories_ Art Supplies categories_ Arts & Crafts categories_ Arts & Entertainment categories_ Asian Fusion categories_ Audio/Visual Equipment Rental categories_ Australian categories_ Austrian categories_ Auto Customization categories_ Auto Detailing categories_ Auto Glass Services categories_ Auto Insurance categories_ Auto Parts & 
Supplies categories_ Auto Repair categories_ Auto Upholstery categories_ Automotive categories_ Baby Gear & Furniture categories_ Bagels categories_ Bakeries categories_ Bangladeshi categories_ Bankruptcy Law categories_ Bar Crawl categories_ Barbeque categories_ Barbers categories_ Bars categories_ Bartenders categories_ Basque categories_ Batting Cages categories_ Beach Bars categories_ Beaches categories_ Beauty & Spas categories_ Bed & Breakfast categories_ Beer categories_ Beer Bar categories_ Beer Garden categories_ Beer Gardens categories_ Beer Hall categories_ Belgian categories_ Bespoke Clothing categories_ Beverage Store categories_ Bike Rentals categories_ Bike Repair/Maintenance categories_ Bikes categories_ Bingo Halls categories_ Bistros categories_ Blow Dry/Out Services categories_ Boat Charters categories_ Boat Dealers categories_ Boat Repair categories_ Boating categories_ Bocce Ball categories_ Body Shops categories_ Books categories_ Bookstores categories_ Botanical Gardens categories_ Bounce House Rentals categories_ Bowling categories_ Brasseries categories_ Brazilian categories_ Brazilian Jiu-jitsu categories_ Breakfast & Brunch categories_ Breweries categories_ Brewpubs categories_ Bridal categories_ British categories_ Bubble Tea categories_ Buffets categories_ Building Supplies categories_ Burgers categories_ Burmese categories_ Bus Tours categories_ Business Consulting categories_ Butcher categories_ CSA categories_ Cabaret categories_ Cafes categories_ Cafeteria categories_ Cajun/Creole categories_ Cambodian categories_ Campgrounds categories_ Canadian (New) categories_ Candy Stores categories_ Cannabis Clinics categories_ Cannabis Collective categories_ Cannabis Dispensaries categories_ Cantonese categories_ Car Dealers categories_ Car Rental categories_ Car Share Services categories_ Car Wash categories_ Car Window Tinting categories_ Cardiologists categories_ Cards & Stationery categories_ Caribbean categories_ Carpet Installation 
categories_ Casinos categories_ Caterers categories_ Champagne Bars categories_ Check Cashing/Pay-day Loans categories_ Cheese Shops categories_ Cheese Tasting Classes categories_ Cheesesteaks categories_ Chicken Shop categories_ Chicken Wings categories_ Chinese categories_ Chiropractors categories_ Chocolatiers & Shops categories_ Christmas Trees categories_ Churches categories_ Churros categories_ Cideries categories_ Cigar Bars categories_ Cinema categories_ Climbing categories_ Clothing Rental categories_ Clowns categories_ Club Crawl categories_ Cocktail Bars categories_ Coffee & Tea categories_ Coffee & Tea Supplies categories_ Coffee Roasteries categories_ Coffeeshops categories_ Colleges & Universities categories_ Colombian categories_ Comedy Clubs categories_ Comfort Food categories_ Comic Books categories_ Commercial Truck Repair categories_ Community Centers categories_ Community Service/Non-Profit categories_ Computers categories_ Contractors categories_ Convenience Stores categories_ Conveyor Belt Sushi categories_ Cooking Classes categories_ Cooking Schools categories_ Cosmetic Dentists categories_ Cosmetics & Beauty Supply categories_ Counseling & Mental Health categories_ Country Clubs categories_ Country Dance Halls categories_ Couriers & Delivery Services categories_ Creperies categories_ Cuban categories_ Cultural Center categories_ Cupcakes categories_ Currency Exchange categories_ Custom Cakes categories_ Czech categories_ DJs categories_ Dance Clubs categories_ Dance Schools categories_ Day Camps categories_ Day Spas categories_ Delicatessen categories_ Delis categories_ Dentists categories_ Department Stores categories_ Desserts categories_ Diagnostic Services categories_ Dim Sum categories_ Diners categories_ Dinner Theater categories_ Discount Store categories_ Distilleries categories_ Dive Bars categories_ Divorce & Family Law categories_ Do-It-Yourself Food categories_ Doctors categories_ Dog Walkers categories_ Dominican categories_ 
Donairs categories_ Donuts categories_ Door Sales/Installation categories_ Drive-Thru Bars categories_ Drugstores categories_ Dry Cleaning categories_ Dry Cleaning & Laundry categories_ Drywall Installation & Repair categories_ Eatertainment categories_ Education categories_ Egyptian categories_ Electronics categories_ Emergency Medicine categories_ Empanadas categories_ Employment Agencies categories_ Engraving categories_ Escape Games categories_ Estate Planning Law categories_ Ethical Grocery categories_ Ethiopian categories_ Ethnic Food categories_ Ethnic Grocery categories_ Event Planning & Services categories_ Eyebrow Services categories_ Eyelash Service categories_ Eyewear & Opticians categories_ Falafel categories_ Farmers Market categories_ Farms categories_ Fashion categories_ Fast Food categories_ Festivals categories_ Filipino categories_ Financial Services categories_ Fireplace Services categories_ Fish & Chips categories_ Fitness & Instruction categories_ Flea Markets categories_ Flooring categories_ Florists categories_ Flowers & Gifts categories_ Fondue categories_ Food categories_ Food Banks categories_ Food Court categories_ Food Delivery Services categories_ Food Stands categories_ Food Tours categories_ Food Trucks categories_ Formal Wear categories_ Foundation Repair categories_ French categories_ Fruits & Veggies categories_ Fur Clothing categories_ Furniture Rental categories_ Furniture Repair categories_ Furniture Reupholstery categories_ Furniture Stores categories_ Game Meat categories_ Gas Stations categories_ Gastropubs categories_ Gay Bars categories_ Gelato categories_ General Dentistry categories_ German categories_ Gift Shops categories_ Glass & Mirrors categories_ Gluten-Free categories_ Golf categories_ Golf Cart Dealers categories_ Golf Lessons categories_ Graphic Design categories_ Greek categories_ Grilling Equipment categories_ Grocery categories_ Guamanian categories_ Guest Houses categories_ Gun/Rifle Ranges categories_ 
Gutter Services categories_ Gyms categories_ Hainan categories_ Hair Extensions categories_ Hair Removal categories_ Hair Salons categories_ Hair Stylists categories_ Haitian categories_ Hakka categories_ Halal categories_ Handyman categories_ Hardware Stores categories_ Hats categories_ Hawaiian categories_ Head Shops categories_ Health & Medical categories_ Health Markets categories_ Health Retreats categories_ Heating & Air Conditioning/HVAC categories_ Herbs & Spices categories_ Himalayan/Nepalese categories_ Historical Tours categories_ Hobby Shops categories_ Holiday Decorations categories_ Holistic Animal Care categories_ Home & Garden categories_ Home Cleaning categories_ Home Decor categories_ Home Health Care categories_ Home Services categories_ Home Window Tinting categories_ Honduran categories_ Honey categories_ Hong Kong Style Cafe categories_ Hookah Bars categories_ Horse Racing categories_ Horseback Riding categories_ Hospitals categories_ Hostels categories_ Hot Dogs categories_ Hot Pot categories_ Hot Tub & Pool categories_ Hotel bar categories_ Hotels categories_ Hotels & Travel categories_ Hungarian categories_ Iberian categories_ Ice Cream & Frozen Yogurt categories_ Ice Delivery categories_ Immigration Law categories_ Imported Food categories_ Indian categories_ Indonesian categories_ Indoor Playcentre categories_ Insurance categories_ Interior Design categories_ International categories_ International Grocery categories_ Internet Cafes categories_ Irish categories_ Irish Pub categories_ Italian categories_ Izakaya categories_ Japanese categories_ Japanese Curry categories_ Jazz & Blues categories_ Jewelry categories_ Juice Bars & Smoothies categories_ Karaoke categories_ Kebab categories_ Kids Activities categories_ Kids Hair Salons categories_ Kitchen & Bath categories_ Knife Sharpening categories_ Kombucha categories_ Korean categories_ Kosher categories_ Laboratory Testing categories_ Lakes categories_ Landmarks & Historical Buildings 
categories_ Landscaping categories_ Laotian categories_ Laser Hair Removal categories_ Laser Tag categories_ Latin American categories_ Laundromat categories_ Laundry Services categories_ Lawyers categories_ Leather Goods categories_ Lebanese categories_ Leisure Centers categories_ Libraries categories_ Life Coach categories_ Lighting Fixtures & Equipment categories_ Limos categories_ Live/Raw Food categories_ Local Fish Stores categories_ Local Flavor categories_ Local Services categories_ Lounges categories_ Macarons categories_ Magicians categories_ Mags categories_ Makeup Artists categories_ Malaysian categories_ Marinas categories_ Marketing categories_ Martial Arts categories_ Masonry/Concrete categories_ Mass Media categories_ Massage categories_ Massage Therapy categories_ Mattresses categories_ Mauritius categories_ Meat Shops categories_ Medical Cannabis Referrals categories_ Medical Centers categories_ Medical Spas categories_ Meditation Centers categories_ Mediterranean categories_ Men's Clothing categories_ Mexican categories_ Middle Eastern categories_ Minho categories_ Mini Golf categories_ Mobile Phones categories_ Modern European categories_ Mongolian categories_ Moroccan categories_ Mortgage Brokers categories_ Motorcycle Repair categories_ Movers categories_ Museums categories_ Music & DVDs categories_ Music & Video categories_ Music Venues categories_ Musicians categories_ Nail Salons categories_ Nail Technicians categories_ New Mexican Cuisine categories_ Nicaraguan categories_ Nightlife categories_ Noodles categories_ Nurseries & Gardening categories_ Nutritionists categories_ Oaxacan categories_ Observatories categories_ Occupational Therapy categories_ Officiants categories_ Oil Change Stations categories_ Olive Oil categories_ Opera & Ballet categories_ Ophthalmologists categories_ Optometrists categories_ Organic Stores categories_ Outdoor Furniture Stores categories_ Outdoor Gear categories_ Outlet Stores categories_ Paint & Sip 
categories_ Paint-Your-Own Pottery categories_ Painters categories_ Pakistani categories_ Pan Asian categories_ Parenting Classes categories_ Parks categories_ Party & Event Planning categories_ Party Bus Rentals categories_ Party Equipment Rentals categories_ Party Supplies categories_ Pasta Shops categories_ Patisserie/Cake Shop categories_ Pawn Shops categories_ Pediatricians categories_ Performing Arts categories_ Persian/Iranian categories_ Personal Assistants categories_ Personal Chefs categories_ Personal Injury Law categories_ Personal Shopping categories_ Peruvian categories_ Pet Adoption categories_ Pet Boarding categories_ Pet Groomers categories_ Pet Services categories_ Pet Sitting categories_ Pet Stores categories_ Pet Training categories_ Pets categories_ Pharmacy categories_ Photo Booth Rentals categories_ Photographers categories_ Photography Stores & Services categories_ Physical Therapy categories_ Piano Bars categories_ Pick Your Own Farms categories_ Piercing categories_ Pilates categories_ Pita categories_ Pizza categories_ Playgrounds categories_ Plumbing categories_ Plus Size Fashion categories_ Poke categories_ Police Departments categories_ Polish categories_ Pool & Billiards categories_ Pool & Hot Tub Service categories_ Pool Halls categories_ Pop-Up Restaurants categories_ Pop-up Shops categories_ Popcorn Shops categories_ Portuguese categories_ Post Offices categories_ Poutineries categories_ Preschools categories_ Pressure Washers categories_ Pretzels categories_ Printing Services categories_ Private Tutors categories_ Professional Services categories_ Property Management categories_ Psychics categories_ Psychologists categories_ Pub Food categories_ Public Markets categories_ Public Services & Government categories_ Public Transportation categories_ Pubs categories_ Puerto Rican categories_ Pumpkin Patches categories_ RV Parks categories_ RV Repair categories_ Ramen categories_ Real Estate categories_ Real Estate Agents categories_ 
Real Estate Services categories_ Recording & Rehearsal Studios categories_ Recreation Centers categories_ Reflexology categories_ Rehabilitation Center categories_ Reiki categories_ Religious Organizations categories_ Resorts categories_ Restaurant Supplies categories_ Restaurants categories_ Reunion categories_ Rock Climbing categories_ Roofing categories_ Rotisserie Chicken categories_ Russian categories_ Sailing categories_ Salad categories_ Salvadoran categories_ Sandwiches categories_ Scandinavian categories_ Scottish categories_ Screen Printing categories_ Seafood categories_ Seafood Markets categories_ Security Services categories_ Security Systems categories_ Senegalese categories_ Septic Services categories_ Shanghainese categories_ Shared Office Spaces categories_ Shaved Ice categories_ Shaved Snow categories_ Shoe Stores categories_ Shopping categories_ Shopping Centers categories_ Sicilian categories_ Siding categories_ Signmaking categories_ Singaporean categories_ Ski Resorts categories_ Ski Schools categories_ Skin Care categories_ Skydiving categories_ Slovakian categories_ Smokehouse categories_ Soccer categories_ Social Clubs categories_ Software Development categories_ Soul Food categories_ Soup categories_ South African categories_ Southern categories_ Souvenir Shops categories_ Spanish categories_ Speakeasies categories_ Specialty Food categories_ Specialty Schools categories_ Sporting Goods categories_ Sports Bars categories_ Sports Clubs categories_ Sports Wear categories_ Squash categories_ Sri Lankan categories_ Stadiums & Arenas categories_ Steakhouses categories_ Street Art categories_ Street Vendors categories_ Strip Clubs categories_ Studio Taping categories_ Sugar Shacks categories_ Summer Camps categories_ Supernatural Readings categories_ Supper Clubs categories_ Surf Schools categories_ Sushi Bars categories_ Swimming Pools categories_ Swimwear categories_ Swiss Food categories_ Syrian categories_ Szechuan categories_ Tabletop Games 
categories_ Tacos categories_ Tai Chi categories_ Taiwanese categories_ Tanning categories_ Tanning Beds categories_ Tapas Bars categories_ Tapas/Small Plates categories_ Tasting Classes categories_ Tattoo categories_ Tax Law categories_ Tax Services categories_ Taxis categories_ Tea Rooms categories_ Team Building Activities categories_ Tempura categories_ Tennis categories_ Teppanyaki categories_ Tex-Mex categories_ Thai categories_ Themed Cafes categories_ Thrift Stores categories_ Ticket Sales categories_ Tickets categories_ Tiki Bars categories_ Tires categories_ Tobacco Shops categories_ Tonkatsu categories_ Tours categories_ Towing categories_ Town Car Service categories_ Toy Stores categories_ Traditional Clothing categories_ Trainers categories_ Trampoline Parks categories_ Transmission Repair categories_ Transportation categories_ Travel Services categories_ Trinidadian categories_ Trophy Shops categories_ Truck Rental categories_ Trusts categories_ Turkish categories_ Tuscan categories_ Udon categories_ Ukrainian categories_ University Housing categories_ Unofficial Yelp Events categories_ Used categories_ Used Bookstore categories_ Uzbek categories_ Vacation Rentals categories_ Vape Shops categories_ Vegan categories_ Vegetarian categories_ Vehicle Wraps categories_ Venezuelan categories_ Venues & Event Spaces categories_ Veterinarians categories_ Videos & Video Game Rental categories_ Vietnamese categories_ Vintage & Consignment categories_ Vinyl Records categories_ Virtual Reality Centers categories_ Vitamins & Supplements categories_ Waffles categories_ Walking Tours categories_ Water Stores categories_ Waxing categories_ Web Design categories_ Wedding Chapels categories_ Wedding Planning categories_ Weight Loss Centers categories_ Whiskey Bars categories_ Wholesale Stores categories_ Wholesalers categories_ Wigs categories_ Wills categories_ Window Washing categories_ Windows Installation categories_ Windshield Installation & Repair categories_ Wine 
& Spirits categories_ Wine Bars categories_ Wine Tasting Classes categories_ Wine Tasting Room categories_ Wine Tours categories_ Wineries categories_ Women's Clothing categories_ Wraps categories_ Yelp Events categories_ Yoga categories_ Zoos categories_Acai Bowls categories_Accessories categories_Active Life categories_Adult Entertainment categories_Afghan categories_African categories_Airports categories_American (New) categories_American (Traditional) categories_Amusement Parks categories_Antiques categories_Appliances categories_Appliances & Repair categories_Aquariums categories_Arabian categories_Arcades categories_Archery categories_Argentine categories_Armenian categories_Art Galleries categories_Art Schools categories_Arts & Crafts categories_Arts & Entertainment categories_Asian Fusion categories_Australian categories_Austrian categories_Auto Repair categories_Automotive categories_Bagels categories_Bakeries categories_Bangladeshi categories_Banks & Credit Unions categories_Bar Crawl categories_Barbeque categories_Bars categories_Bartenders categories_Basque categories_Beauty & Spas categories_Bed & Breakfast categories_Beer categories_Beer Bar categories_Beer Gardens categories_Belgian categories_Beverage Store categories_Bistros categories_Boat Charters categories_Books categories_Bowling categories_Brasseries categories_Brazilian categories_Breakfast & Brunch categories_Breweries categories_Brewpubs categories_Bridal categories_British categories_Bubble Tea categories_Buffets categories_Building Supplies categories_Burgers categories_Burmese categories_Business Consulting categories_Butcher categories_CSA categories_Cafes categories_Cafeteria categories_Cajun/Creole categories_Calabrian categories_Cambodian categories_Canadian (New) categories_Candy Stores categories_Cantonese categories_Car Dealers categories_Car Wash categories_Caribbean categories_Casinos categories_Caterers categories_Champagne Bars categories_Cheese Shops categories_Cheesesteaks 
categories_Chicken Shop categories_Chicken Wings categories_Chinese categories_Chocolatiers & Shops categories_Cideries categories_Cinema categories_Cocktail Bars categories_Coffee & Tea categories_Coffee & Tea Supplies categories_Coffee Roasteries categories_Colombian categories_Comedy Clubs categories_Comfort Food categories_Community Service/Non-Profit categories_Contractors categories_Convenience Stores categories_Conveyor Belt Sushi categories_Cooking Classes categories_Cooking Schools categories_Country Clubs categories_Couriers & Delivery Services categories_Courthouses categories_Creperies categories_Cuban categories_Cupcakes categories_Custom Cakes categories_Czech categories_Dance Clubs categories_Day Spas categories_Delicatessen categories_Delis categories_Dentists categories_Department Stores categories_Desserts categories_Dim Sum categories_Diners categories_Dinner Theater categories_Discount Store categories_Dive Bars categories_Do-It-Yourself Food categories_Doctors categories_Dominican categories_Donairs categories_Donuts categories_Drugstores categories_Dry Cleaning & Laundry categories_Eatertainment categories_Education categories_Egyptian categories_Electricians categories_Electronics categories_Empanadas categories_Escape Games categories_Ethical Grocery categories_Ethiopian categories_Ethnic Food categories_Ethnic Grocery categories_Event Planning & Services categories_Falafel categories_Family Practice categories_Farmers Market categories_Fashion categories_Fast Food categories_Festivals categories_Filipino categories_Financial Services categories_Fish & Chips categories_Fitness & Instruction categories_Florists categories_Flowers & Gifts categories_Fondue categories_Food categories_Food Court categories_Food Delivery Services categories_Food Stands categories_Food Tours categories_Food Trucks categories_French categories_Fruits & Veggies categories_Funeral Services & Cemeteries categories_Gardeners categories_Gas Stations 
categories_Gastropubs categories_Gelato categories_German categories_Gift Shops categories_Gluten-Free categories_Golf categories_Golf Equipment categories_Golf Equipment Shops categories_Greek categories_Grocery categories_Guamanian categories_Guest Houses categories_Gyms categories_Hair Salons categories_Hair Stylists categories_Haitian categories_Hakka categories_Halal categories_Hawaiian categories_Health & Medical categories_Health Markets categories_Herbs & Spices categories_Himalayan/Nepalese categories_Hobby Shops categories_Home & Garden categories_Home Cleaning categories_Home Decor categories_Home Services categories_Honduran categories_Hong Kong Style Cafe categories_Hookah Bars categories_Hospitals categories_Hot Dogs categories_Hot Pot categories_Hotels categories_Hotels & Travel categories_Hungarian categories_Hunting & Fishing Supplies categories_Iberian categories_Ice Cream & Frozen Yogurt categories_Imported Food categories_Indian categories_Indonesian categories_Interior Design categories_International categories_International Grocery categories_Internet Cafes categories_Irish categories_Irish Pub categories_Italian categories_Izakaya categories_Japanese categories_Japanese Curry categories_Jazz & Blues categories_Juice Bars & Smoothies categories_Karaoke categories_Kebab categories_Keys & Locksmiths categories_Kids Activities categories_Kitchen & Bath categories_Korean categories_Kosher categories_Lakes categories_Landmarks & Historical Buildings categories_Laotian categories_Laser Tag categories_Latin American categories_Laundry Services categories_Lawyers categories_Lebanese categories_Libraries categories_Life Coach categories_Limos categories_Live/Raw Food categories_Local Flavor categories_Local Services categories_Lounges categories_Macarons categories_Malaysian categories_Martial Arts categories_Massage categories_Massage Therapy categories_Mattresses categories_Meat Shops categories_Mediterranean categories_Mexican categories_Middle 
… categories_Wine Bars categories_Wineries categories_Wraps categories_Yoga

[7 rows × several hundred columns, truncated for readability: each row of the merged feature matrix combines a single review (review_id, user_id, business_id, star rating, vote counts, sentiment score), rolling average ratings for the user and the business, the business geolocation, and the one-hot encoded attribute and categories_* indicator columns listed above]
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
7 A-n5xtGMR5Frz2KPJTfRzw WiY9q-Jz42huWzq90fgAWA --9e1ONYQuAa-CB_Rrw7Tw 2 0 0 0 -1 0.474595 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 530.0 36.123183 -115.169190 1613 4.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 4 0 1 0 0 0 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
8 DqmGICsMu16YttevGUZCjg q9q9nVaTYz7tScwZLHNO3A --9e1ONYQuAa-CB_Rrw7Tw 5 0 1 0 1 0.900049 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 5 0 0 2 0 1 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
9 pSHJpti6SfYIK3XhHzvrXg q_sv4HEU4XM88x9z6WG-Tw --9e1ONYQuAa-CB_Rrw7Tw 2 0 0 0 -1 0.487348 4.000000 4.000000 4.000000 4.001269 4.001652 4.015008 0 530.0 36.123183 -115.169190 1613 4.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.000000 10 0 0 4 3 0 22 2.0 4.000000 1.0 3.820433 0.773019 3.556721 3.79608 3.686094 2.000000 4.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 4.000000 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 2.000000 4.000000 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [29]:
test_set.shape
Out[29]:
(153993, 2774)
In [30]:
test_set.to_pickle('../dataset/m2_n9/model_test_set_2.pickle')
In [31]:
_del_all()

5.3.2 Dimensionality reduction version

We summarize the kinds of data we currently have, in order to decide how to handle each feature.
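Before looking at the full `head()` output, a quick inventory by dtype and missingness is often the fastest way to group columns for later treatment (encode, impute, or drop). A minimal sketch, using a tiny stand-in DataFrame rather than the notebook's real `train_set`:

```python
import pandas as pd

# Hypothetical toy stand-in for train_set, just to illustrate the idea.
train_set = pd.DataFrame({
    "stars_review": [5, 4, 1],                     # numeric column
    "cuisine": ["Mexican", "Mexican", "Chinese"],  # categorical feature
    "WiFi": [None, "No", None],                    # attribute with missing values
})

# Count how many columns there are of each dtype...
dtype_counts = train_set.dtypes.value_counts()

# ...and measure the fraction of missing values per column.
missing_frac = train_set.isna().mean()

print(dtype_counts)
print(missing_frac)
```

On the real `train_set`, the same two lines would immediately flag which attribute columns (e.g. `WiFi`, `Alcohol`) are mostly `NaN` and which are usable as-is.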

In [47]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set.pickle')
train_set.head()
Out[47]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-01-11 19:55:31 1 0.622302 NaN NaN NaN NaN NaN NaN 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jexie 3.846154 2009-06-24 18 0 1 8 4 4 4 13.0 4.222222 9.0 3.798692 11.168551 NaN 5.0 NaN NaN 3.857143 NaN 3.0 5.0 NaN 2.00 NaN 5.0 NaN NaN 5.000000 NaN 3.00 5.0 NaN 2.000000 NaN 5.0 NaN NaN 3.926822 NaN 2.866472 5.000000 NaN 2.000000
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 2018-02-25 17:47:12 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Alex 3.333333 2015-07-27 18 0 0 1 1 1 454 3.0 3.333333 3.0 3.231620 2.702970 NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.0 NaN 5.000000 2.5 NaN NaN NaN NaN NaN NaN 4.000000 NaN 5.000000 2.353400 NaN NaN NaN NaN
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 2018-05-06 04:22:48 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Mermaid 2.571429 2012-01-01 27 0 0 31 3 3 46 14.0 2.692308 13.0 2.692835 12.704916 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 2.333333 NaN 2.500000 1.0 3.0 NaN 3.200000 NaN NaN 1.0 3.0 NaN 2.505665 1.0 2.990709 NaN 3.204491 NaN NaN 1.000000 2.974502 NaN
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 2018-04-22 17:42:09 1 0.988395 NaN NaN NaN NaN NaN NaN 0 The Spicy Amigos 821 4 Avenue SW Mexican T2P 0K5 51.049673 -114.079977 24 4.0 True NaN False NaN NaN Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 NaN Jen 3.703313 2013-02-27 2 0 0 1 0 0 622 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 2018-05-21 05:09:07 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.0 Alex 2.980769 2008-11-15 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.0 NaN 2.0 3.040000 2.6 3.6 2.5 1.000000 3.75 2.923077 4.0 NaN 1.666667 3.146341 2.6 3.75 2.5 1.0 3.666667 2.954478 4.0 NaN 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739
In [48]:
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set.pickle')
test_set.head()
Out[48]:
review_id user_id business_id stars_review useful_review funny_review cool_review date bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes name address cuisine postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count user_name average_stars_user yelping_since review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 -j8YU0f5cL_fbnzsi1zkpA rEsBrt6U7i8O4rC81lV6NQ --6MefnULPED_I942VcFNA 4 0 0 0 2018-10-21 18:45:39 1 0.997555 NaN NaN NaN NaN NaN NaN 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Fran 3.703313 2017-01-09 67 1 1 16 6 2 3574 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
1 1vEQsKjTfGCcNdQ2Hhuhew yEP9vNFq3edLldNzhm6hgQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-21 01:07:38 -1 0.553523 1.8 2.000000 1.799679 1.838975 2.007105 1.777964 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Leung 2.875000 2016-07-20 86 0 1 34 9 12 166 16.0 3.000000 15.0 2.909282 13.563981 1.8 2.20 NaN 4.5 3.333333 1.000000 3.0 NaN NaN NaN 2.000000 2.50 NaN 4.5 3.333333 1.0 3.0 NaN NaN NaN 1.799679 2.460018 NaN 4.571695 3.355656 1.000000 2.924220 NaN NaN NaN
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 2018-09-09 03:20:03 1 0.990602 4.3 4.333333 4.299574 4.349620 4.302949 4.288981 1 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Jo 4.108108 2017-08-08 227 2 7 99 47 30 286 37.0 4.161290 31.0 4.130733 31.326167 4.3 3.75 NaN 3.5 4.454545 3.666667 3.8 NaN 4.0 4.0 4.333333 3.75 NaN 3.0 4.555556 3.5 4.0 NaN 4.0 4.0 4.299574 3.724926 NaN 3.340936 4.538601 3.626374 3.946442 NaN 4.0 4.0
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 2018-10-04 01:37:05 1 0.968214 NaN NaN NaN NaN NaN NaN 0 John's Chinese BBQ Restaurant 328 Highway 7 E, Chalmers Gate 11, Unit 10 Chinese L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 NaN Cindy 3.703313 2016-03-09 1 0 0 0 0 0 2110 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 2018-11-06 19:48:01 1 0.995667 NaN NaN NaN NaN NaN NaN 1 Delmonico Steakhouse 3355 Las Vegas Blvd S Others 89109 36.123183 -115.169190 1613 4.0 False True False True No Full_Bar Cajun/Creole, Seafood, Steakhouses, Restaurants Las Vegas 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 17:00:00 22:00:00 22:00:00 22:00:00 22:00:00 22:30:00 22:30:00 22:00:00 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 NaN Katie 3.703313 2017-06-15 59 0 0 5 0 1 22 0.0 3.703313 0.0 3.703313 0.000000 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
In [49]:
train_test_set = _pd.concat([train_set, test_set], sort=False)
In [50]:
print("train size:", train_set.shape)
print("test size:", test_set.shape)
print("train_test size:", train_test_set.shape)
print(train_set.shape[0] + test_set.shape[0] == train_test_set.shape[0])
_train_len = train_set.shape[0]
train size: (558386, 99)
test size: (153993, 99)
train_test size: (712379, 99)
True
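Concatenating the two splits before preprocessing guarantees they share the same fill values and encodings; the saved train length lets us split the frame back afterwards by row position. A minimal sketch of that round trip, on toy frames:

```python
import pandas as pd

# Toy stand-ins for the real train/test sets.
train = pd.DataFrame({'x': [1, 2, 3]})
test = pd.DataFrame({'x': [4, 5]})

# Combine for joint preprocessing, remembering where the train part ends.
combined = pd.concat([train, test], sort=False)
train_len = train.shape[0]

# ... shared preprocessing would happen here ...

# Split back by row position.
train_back = combined.iloc[:train_len]
test_back = combined.iloc[train_len:]
```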

Drop unneeded features (free-text names, raw dates, and other columns the models will not use)

In [54]:
train_test_set.drop(columns=['date', 'name', 'address', 'yelping_since', 'user_name', 'cuisine'], inplace=True)

Fill missing values

In [55]:
# Restaurant attribute columns: treat a missing attribute as its own 'None' category.
for col in ['OutdoorSeating', 'BusinessAcceptsCreditCards', 'RestaurantsDelivery',
            'RestaurantsReservations', 'WiFi', 'Alcohol']:
    train_test_set[col] = train_test_set[col].fillna('None')
In [56]:
# Opening/closing hours: fill missing values with the per-column mode, then cast to string.
# The fill must happen before astype(str), since astype turns NaN into the string 'NaN';
# mode() returns a Series, so we take its first element as the fill value.
days = ['Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday']
for col in [d + s for d in days for s in ('_Open', '_Close')]:
    train_test_set[col] = train_test_set[col].fillna(train_test_set[col].mode()[0]).astype(str)
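A subtlety in the mode-based fill: `Series.mode()` returns a Series, so passing it to `fillna` directly aligns by index and only fills a gap whose index happens to match one of the mode's; the scalar `mode()[0]` fills every gap. A toy illustration:

```python
import pandas as pd

s = pd.Series(['11:00:00', None, '11:00:00', None])

# fillna with the mode Series aligns on index: only index 0 could ever be filled.
aligned = s.fillna(s.mode())

# fillna with the scalar mode fills every missing entry.
scalar = s.fillna(s.mode()[0])
```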
In [57]:
# Remaining numeric columns: fill floats with the column mean and integer-typed
# columns with the rounded mean (train_test_types holds the per-column dtypes
# computed earlier).
for ind, dtype in train_test_types.iteritems():
    if _np.issubdtype(dtype, _np.floating):
        train_test_set[ind] = train_test_set[ind].fillna(train_test_set[ind].mean())
    elif _np.issubdtype(dtype, _np.integer):
        train_test_set[ind] = train_test_set[ind].fillna(round(train_test_set[ind].mean()))
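The dtype dispatch can be checked on a toy frame: `issubdtype` matches every float width under `floating` and every integer width under `integer`. A minimal sketch, using plain `numpy`/`pandas` names instead of the notebook's underscore aliases:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({'f': [1.0, np.nan, 3.0],
                   'i': np.array([1, 2, 4], dtype=np.int64)})

for col, dtype in df.dtypes.items():
    if np.issubdtype(dtype, np.floating):
        df[col] = df[col].fillna(df[col].mean())          # mean of 1.0 and 3.0 -> 2.0
    elif np.issubdtype(dtype, np.integer):
        df[col] = df[col].fillna(round(df[col].mean()))   # int64 column has no NaN: no-op
```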
In [58]:
# check that no feature still has null values
train_test_set.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712379 entries, 0 to 153992
Data columns (total 93 columns):
review_id                            712379 non-null object
user_id                              712379 non-null object
business_id                          712379 non-null object
stars_review                         712379 non-null int64
useful_review                        712379 non-null int64
funny_review                         712379 non-null int64
cool_review                          712379 non-null int64
bin_truth_score                      712379 non-null int64
real_truth_score                     712379 non-null float64
cuisine_av_hist                      712379 non-null float64
cuisine_av_hist_bin                  712379 non-null float64
cuisine_av_hist_real                 712379 non-null float64
coll_score                           712379 non-null float64
coll_score_bin                       712379 non-null float64
coll_score_real                      712379 non-null float64
likes                                712379 non-null int32
postal_code                          712379 non-null object
latitude                             712379 non-null float64
longitude                            712379 non-null float64
review_count                         712379 non-null int64
stars_restaurant                     712379 non-null float64
OutdoorSeating                       712379 non-null object
BusinessAcceptsCreditCards           712379 non-null object
RestaurantsDelivery                  712379 non-null object
RestaurantsReservations              712379 non-null object
WiFi                                 712379 non-null object
Alcohol                              712379 non-null object
categories                           712379 non-null object
city                                 712379 non-null object
Monday_Open                          712379 non-null object
Tuesday_Open                         712379 non-null object
Wednesday_Open                       712379 non-null object
Thursday_Open                        712379 non-null object
Friday_Open                          712379 non-null object
Saturday_Open                        712379 non-null object
Sunday_Open                          712379 non-null object
Monday_Close                         712379 non-null object
Tuesday_Close                        712379 non-null object
Wednesday_Close                      712379 non-null object
Thursday_Close                       712379 non-null object
Friday_Close                         712379 non-null object
Saturday_Close                       712379 non-null object
Sunday_Close                         712379 non-null object
average_stars_review                 712379 non-null float64
num_reviews_review                   712379 non-null float64
average_stars_bin_review             712379 non-null float64
num_reviews_bin_review               712379 non-null float64
average_stars_real_review            712379 non-null float64
num_reviews_real_review              712379 non-null float64
compliment_count                     712379 non-null float64
average_stars_user                   712379 non-null float64
review                               712379 non-null int64
years_of_elite                       712379 non-null int64
fans                                 712379 non-null int64
useful_user                          712379 non-null int64
cool_user                            712379 non-null int64
funny_user                           712379 non-null int64
friends                              712379 non-null int64
num_reviews_user                     712379 non-null float64
average_stars_bin_user               712379 non-null float64
num_reviews_bin_user                 712379 non-null float64
average_stars_real_user              712379 non-null float64
num_reviews_real_user                712379 non-null float64
av_rat_chinese_cuisine               712379 non-null float64
av_rat_japanese_cuisine              712379 non-null float64
av_rat_mexican_cuisine               712379 non-null float64
av_rat_italian_cuisine               712379 non-null float64
av_rat_others_cuisine                712379 non-null float64
av_rat_american_cuisine              712379 non-null float64
av_rat_korean_cuisine                712379 non-null float64
av_rat_mediterranean_cuisine         712379 non-null float64
av_rat_thai_cuisine                  712379 non-null float64
av_rat_asianfusion_cuisine           712379 non-null float64
av_rat_chinese_cuisine_bin           712379 non-null float64
av_rat_japanese_cuisine_bin          712379 non-null float64
av_rat_mexican_cuisine_bin           712379 non-null float64
av_rat_italian_cuisine_bin           712379 non-null float64
av_rat_others_cuisine_bin            712379 non-null float64
av_rat_american_cuisine_bin          712379 non-null float64
av_rat_korean_cuisine_bin            712379 non-null float64
av_rat_mediterranean_cuisine_bin     712379 non-null float64
av_rat_thai_cuisine_bin              712379 non-null float64
av_rat_asianfusion_cuisine_bin       712379 non-null float64
av_rat_chinese_cuisine_real          712379 non-null float64
av_rat_japanese_cuisine_real         712379 non-null float64
av_rat_mexican_cuisine_real          712379 non-null float64
av_rat_italian_cuisine_real          712379 non-null float64
av_rat_others_cuisine_real           712379 non-null float64
av_rat_american_cuisine_real         712379 non-null float64
av_rat_korean_cuisine_real           712379 non-null float64
av_rat_mediterranean_cuisine_real    712379 non-null float64
av_rat_thai_cuisine_real             712379 non-null float64
av_rat_asianfusion_cuisine_real      712379 non-null float64
dtypes: float64(53), int32(1), int64(13), object(26)
memory usage: 508.2+ MB

Convert non-numerical features

We print and plot the distribution of cities to inspect the long tail and decide how many distinct values to keep. The raw counts also expose inconsistent spellings (e.g. 'Las  Vegas', 'LasVegas', 'Tornto') that may need normalization.
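One common choice once the counts are known (a sketch, not necessarily the rule used later in the notebook) is to keep only cities above a frequency threshold and collapse the tail into a single 'Other' label:

```python
import pandas as pd

# Toy city column; the threshold of 3 is illustrative only.
cities = pd.Series(['Las Vegas'] * 5 + ['Phoenix'] * 3 + ['Tornto', 'Whiitby'])

counts = cities.value_counts()
keep = counts[counts >= 3].index                       # cities frequent enough to keep
collapsed = cities.where(cities.isin(keep), 'Other')   # everything else becomes 'Other'
```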

In [59]:
city_count = train_test_set['city'].value_counts()
print(city_count.to_string())
print(city_count.shape)
Las Vegas                           208437
Phoenix                              71126
Toronto                              57047
Charlotte                            40190
Scottsdale                           36579
Pittsburgh                           26891
Henderson                            21518
Montréal                            18531
Tempe                                18354
Mesa                                 17235
Chandler                             15441
Gilbert                              14055
Cleveland                            12873
Glendale                             10742
Madison                               9688
Markham                               8248
Calgary                               7958
Mississauga                           7192
Peoria                                6582
North Las Vegas                       4771
Surprise                              4117
Richmond Hill                         3886
Goodyear                              3020
Concord                               2905
Scarborough                           2544
Champaign                             2513
Vaughan                               2496
Avondale                              2451
Huntersville                          2046
Lakewood                              2008
Matthews                              1986
North York                            1977
Fort Mill                             1851
Brampton                              1643
Cave Creek                            1538
Gastonia                              1282
Cornelius                             1278
Etobicoke                             1229
Akron                                 1188
Westlake                              1118
Boulder City                          1059
Thornhill                              985
Strongsville                           957
Middleton                              930
Mentor                                 864
Fountain Hills                         836
Cuyahoga Falls                         828
Rock Hill                              820
Cleveland Heights                      813
Parma                                  789
Kent                                   745
Litchfield Park                        745
Aurora                                 712
North Olmsted                          693
Beachwood                              654
Independence                           648
Pineville                              647
Medina                                 647
Newmarket                              637
Belmont                                631
Monroeville                            610
Willoughby                             607
Rocky River                            579
Solon                                  576
Oakville                               573
Indian Trail                           545
Avon                                   543
Bridgeville                            512
Davidson                               507
Hudson                                 500
Pickering                              493
Laval                                  491
Urbana                                 491
Ajax                                   479
Fitchburg                              464
Waxhaw                                 462
Montreal                               441
Sun Prairie                            434
Homestead                              429
Paradise Valley                        425
Chagrin Falls                          410
Stow                                   406
Canonsburg                             400
Woodbridge                             374
Wexford                                368
Whitby                                 367
Woodmere                               360
Carnegie                               358
Lyndhurst                              358
Coraopolis                             341
Fairlawn                               341
Bethel Park                            331
Unionville                             326
Elyria                                 319
Verona                                 315
Berea                                  313
Spring Valley                          308
Sewickley                              306
Orange Village                         297
Carefree                               294
Denver                                 288
Brossard                               284
Brunswick                              282
Sun City                               280
Indian Land                            267
Broadview Heights                      259
Fairview Park                          254
Monona                                 253
Laveen                                 253
Harrisburg                             252
Avon Lake                              249
Tolleson                               243
Twinsburg                              243
Mayfield Heights                       242
Buckeye                                242
Middleburg Heights                     239
Macedonia                              238
Dollard-des-Ormeaux                    238
Brooklyn                               233
Oakmont                                224
Mint Hill                              224
Painesville                            223
North Royalton                         210
Kannapolis                             204
Amherst                                203
Queen Creek                            203
Streetsboro                            196
Monroe                                 194
Clover                                 193
El Mirage                              192
Orange                                 188
Murrysville                            188
Mount Holly                            185
Verdun                                 170
Dorval                                 168
Maple                                  167
Lake Wylie                             167
Chardon                                166
University Heights                     165
Valley View                            153
Highland Heights                       153
South Las Vegas                        153
Airdrie                                150
York                                   149
Tega Cay                               148
Lorain                                 147
West Mifflin                           146
McMurray                               144
Moon Township                          144
Chesterland                            141
Euclid                                 141
Allison Park                           140
Irwin                                  139
Brecksville                            138
Westmount                              138
Waunakee                               137
Brook Park                             137
Stoughton                              134
Pointe-Claire                          130
Northfield                             127
Warrensville Heights                   127
Saint-Laurent                          126
South Euclid                           126
East York                              124
McKees Rocks                           120
Bellevue                               117
Olmsted Falls                          116
Braddock                               116
Blue Diamond                           109
Ahwatukee                              109
New Kensington                         108
Seven Hills                            107
Mooresville                            106
Upper Saint Clair                      105
Longueuil                              105
North Ridgeville                       102
West Homestead                          98
Eastlake                                97
Oregon                                  97
Gibsonia                                96
Sun City West                           92
Anthem                                  90
Willoughby Hills                        90
Pleasant Hills                          90
Youngtown                               88
Willowick                               88
DeForest                                86
Parma Heights                           84
Upper St Clair                          83
Tallmadge                               82
Stouffville                             81
Bedford                                 80
Las  Vegas                              79
Robinson Township                       79
Whitchurch-Stouffville                  77
LasVegas                                77
Peninsula                               76
Saint-Sauveur                           75
Stallings                               74
Bolton                                  73
Trafford                                72
Glenshaw                                72
Mount Horeb                             71
Wesley Chapel                           70
Moreland Hills                          70
Pierrefonds                             69
Missisauga                              68
Higley                                  67
Heidelberg                              67
Millvale                                66
Vaudreuil-Dorion                        65
Lachine                                 64
N. Las Vegas                            64
Wickliffe                               62
Ambridge                                62
Finleyville                             62
Burton                                  61
Munroe Falls                            61
Brentwood                               59
Shaker Heights                          59
Tarentum                                59
McKeesport                              59
Sun Lakes                               58
Georgetown                              57
Rantoul                                 57
Bay Village                             57
East Gwillimbury                        56
Cecil                                   56
Wadsworth                               56
Lasalle                                 56
Richfield                               55
Mt. Lebanon                             55
Kirkland                                55
Mc Kees Rocks                           54
Oakdale                                 54
North Versailles                        54
Mayfield                                53
MESA                                    53
King City                               52
Blakeney                                51
Enterprise                              50
Ross Township                           50
Cramerton                               50
Mahomet                                 49
North Randall                           49
Pepper Pike                             48
McAdenville                             48
Cottage Grove                           47
Wilkinsburg                             47
Outremont                               47
lyndhurst                               46
Savoy                                   45
Grafton                                 45
Turtle Creek                            45
Maple Heights                           45
Lowell                                  44
Springdale                              44
Concord Mills                           44
Valley City                             43
Richmond Heights                        43
White Oak                               43
Bradford                                43
Bedford Heights                         42
West View                               42
Dallas                                  42
GILBERT                                 42
Moon                                    41
L'Île-Perrot                            40
Grand River                             40
Monticello                              40
Sheffield Village                       40
Hampton Township                        39
Rexdale                                 39
Lower Lawrenceville                     39
Guadalupe                               38
McFarland                               38
Bradford West Gwillimbury               37
North Huntingdon                        37
Copley                                  36
Caledon                                 36
Laveen Village                          36
Summerlin                               35
Fairport Harbor                         35
Gates Mills                             35
Elizabeth                               34
Mentor-on-the-Lake                      34
Saint-Hubert                            33
Mayfield Village                        33
Boucherville                            33
De Forest                               33
Apache Junction                         33
Windsor                                 33
Kirtland                                33
Locust                                  33
Munhall                                 32
Penn Hills                              32
Saint-Lambert                           31
Leetsdale                               31
Harmarville                             31
Tremont                                 31
Aspinwall                               31
Norton                                  30
Tornto                                  30
Crafton                                 29
Garfield Heights                        29
Robinson                                28
Weddington                              27
Sainte-Anne-de-Bellevue                 27
Saint-Léonard                          27
Plum                                    26
Woodmere Village                        26
Blawnox                                 26
La Prairie                              26
Castle Shannon                          26
Rio Verde                               26
Fort McDowell                           25
Middlefield                             25
Scottdale                               24
charlotte                               24
Newbury                                 23
Sharpsburg                              23
Robinson Twp.                           23
Cross Plains                            23
Mount Lebanon                           23
N Las Vegas                             22
Port Credit                             22
Rocky View                              22
Glendale Az                             22
Saint-Jérôme                          22
Saint-Jean-sur-Richelieu                22
Emsworth                                21
Lower Burrell                           21
Clairton                                21
Mont-Royal                              21
Stanley                                 21
Chambly                                 21
Montrose                                21
Cheswick                                20
Etna                                    20
Chestermere                             19
Harrison City                           19
Mc Murray                               19
Walton Hills                            19
Halton Hills                            19
Monongahela                             18
Midland                                 18
South Park                              18
Anjou                                   18
North Olmstead                          17
Blainville                              17
Greenfield Park                         17
Dollard-Des-Ormeaux                     16
Brooklin                                16
Ogden                                   16
Mt Lebanon                              16
Imperial                                16
St-Benoît de Mirabel                   15
LaSalle                                 15
Baldwin                                 15
St Joseph                               15
Ranlo                                   15
Montreal-Nord                           15
Export                                  15
LAS VEGAS                               15
Richmond Hil                            15
Don Mills                               14
Columbia Station                        14
Mc Farland                              14
Nellis AFB                              14
Kleinburg                               14
Brookpark                               14
McCandless Township                     14
New Eagle                               13
Olmsted Township                        13
Rouses Point                            13
Sainte-Thérèse                        13
Phx                                     13
Boisbriand                              13
Saint-Lazare                            12
Glassport                               12
Dollard-des Ormeaux                     12
Peters Township                         12
Montréal-Ouest                         12
Harrisbug                               12
Green Tree                              11
Saint-Eustache                          11
Elizabeth Township                      11
Sagamore Hills                          11
Mc Donald                               11
Indian land                             11
Lower burrell                           11
Presto                                  11
Jefferson Hills                         11
Sainte-Anne-De-Bellevue                 11
Terrebonne                              11
South Park Township                     11
Waddell                                 11
Avalon                                  11
Downtown                                11
Etibicoke                               10
Bainbridge                              10
Beaconsfield                            10
Rosemère                               10
Saint-Hyacinthe                         10
Sainte-Adèle                           10
East Liberty                            10
La Salle                                10
PHOENIX                                 10
Russellton                              10
Mont-Saint-Hilaire                      10
Lagrange                                10
Mint  Hill                              10
Lachute                                  9
Swissvale                                9
Medina Township                          9
Mentor On The Lake                       9
East Ajax                                9
Neville Island                           9
Paradise                                 9
Forest Hills                             9
North  York                              9
Warrensville Hts.                        9
Mantua                                   9
Châteauguay                             8
Hinckley                                 8
N Ridgeville                             8
Schomberg                                8
Moseley                                  8
Balzac                                   8
Tuscola                                  8
Wilmerding                               8
Oakland                                  8
East Cleveland                           8
Bainbridge Township                      8
North Toronto                            7
Montréal-Nord                           7
Montreal-West                            7
Belle Vernon                             7
Nobleton                                 7
Central                                  7
Belleville                               7
Willowdale                               7
St-Jerome                                7
Mont-Saint-Grégoire                     7
Moon TWP                                 7
Vimont                                   7
Rigaud                                   7
Mascouche                                7
Milton                                   7
Sheffield                                7
Library                                  7
Ville Mont-Royal                         7
Glen Williams                            7
Lowesville                               7
Ben Avon                                 6
Sunrise Manor                            6
Sheffield Lake                           6
solon                                    6
Brunswick Hills                          6
Cuddy                                    6
Goodwood                                 6
Shaler Township                          6
Sainte-Adele                             6
Whiitby                                  6
Venetia                                  6
Indianola                                6
Arnold                                   6
Saintt-Bruno-de-Montarville              6
Thornhil                                 5
Duquesne                                 5
Chateauguay                              5
Wilkins Township                         5
Dane                                     5
Communauté-Urbaine-de-Montréal         5
SCOTTSDALE                               5
GOODYEAR                                 5
Bellvue                                  5
Herminie                                 5
Mt Holly                                 5
Omaha                                    5
Kahnawake                                5
Bath                                     5
Sturgeon                                 5
Troy Township                            5
Sainte-Genevieve                         5
Saint-Leonard                            5
Mckees Rocks                             5
McCandless                               5
TORONTO                                  5
Mesa AZ                                  5
St-Leonard                               5
Mt. Holly                                5
Auburn Township                          4
Chargrin Falls                           4
Sainte-Julie                             4
NELLIS AFB                               4
Deforest                                 4
Saint-Marc-sur-Richelieu                 4
Ravenna                                  4
Dravosburg                               4
Mcfarland                                4
ETOBICOKE                                4
St Leonard                               4
King                                     4
Mcknight                                 4
Uxbridge                                 4
L'ile-Perrot                             4
Fort Mcdowell                            4
Saint-Bruno-de-Montarville               4
Beeton                                   4
Saint-Jerome                             4
Elrama                                   4
Litchfield                               4
Saint Joseph                             4
Dollard-Des Ormeaux                      4
St-Bruno-de-Montarville                  4
L'ÃŽle-Bizard                            4
Scarobrough                              4
Lawrence                                 4
Saint-Constant                           4
Lindale                                  4
Beloeil                                  4
Repentigny                               4
Estérel                                 4
Mcmurray                                 4
Homer                                    3
Delson                                   3
McDonald                                 3
Morin-Heights                            3
Central City                             3
South Amherst                            3
Painesville Township                     3
Canonsburd                               3
Mount Albert                             3
Paoli                                    3
Nellis Air Force Base                    3
Northwest Calgary                        3
Oakwood Village                          3
Central City Village                     3
Concord Township                         3
Rostraver                                3
Mirabel                                  3
Saint-Henri                              3
Fisher                                   3
Harwick                                  3
Mooers                                   3
Rural Ridge                              3
verdun                                   3
Champlain                                3
Lake Park                                3
River Drive Park                         3
Etobiicoke                               3
Montgomery                               3
East Mc Keesport                         3
Marshall                                 3
Gifford                                  2
Auburn Twp                               2
Northfield Center Township               2
Mount Oliver                             2
Mount Washington                         2
Midnapore                                2
Pointe-Calumet                           2
Creighton                                2
Mentor On the Lake                       2
Hyland Heights                           2
Saint Laurent                            2
Holland Landing                          2
Palgrave                                 2
kirtland                                 2
Saint-Basile-Le-Grand                    2
Schottsdale                              2
Pointe-Aux-Trembles                      2
Venise-en-Québec                        2
Sainte-Marguerite-Esterel                2
Richmond Hts                             2
Thorncliffe Park                         2
Toronto Scarborough                      2
Rillton                                  2
Val-Morin                                2
Ross                                     2
Clark                                    2
Penn Hills Township                      2
East Pittsburgh                          2
Bois-des-Filion                          2
Joliette                                 2
Saint-Basile-le-Grand                    2
Cleveland Hghts.                         2
Hendersonville                           2
Mckeesport                               2
Sauk City                                2
Pincourt                                 2
Brookline                                2
Clarkson                                 2
Rocky View No. 44                        2
THORNHILL                                2
Waterloo                                 2
Candiac                                  2
Sainte-Thérèse-de-Blainville           2
Saint-Hippolyte                          2
Rosemere                                 2
Hemmingford                              2
Montréal-West                           1
North Braddock                           1
Oakwood                                  1
Hampstead                                1
Tottenham                                1
Saint-laurent                            1
Salaberry-De-Valleyfield                 1
Caledon Village                          1
Henryville                               1
Les Coteaux                              1
Cote Saint-Luc                           1
Port Vue                                 1
Sainte-Marthe                            1
Aliquippa                                1
Laval, Ste Dorothee                      1
Sainte-Catherine                         1
East McKeesport                          1
Sainte-Marguerite-du-lac-Masson          1
North Huntington                         1
Pitcairn                                 1
Cuyahoga Fls                             1
Québec                                  1
Saint-Bruno                              1
Lawrenceville                            1
Salaberry-de-Valleyfield                 1
Caledon East                             1
Inglewood                                1
Fabreville                               1
Shorewood Hills                          1
Midway                                   1
Edgewood                                 1
cave creek                               1
Tolono                                   1
Malton                                   1
Beauharnois                              1
Ange-Gardien                             1
West Elizabeth                           1
Richmond Height                          1
Brownsburg-Chatham                       1
Iberville                                1
Montreal-Est                             1
Oak Ridges                               1
Rocky View County                        1
Kennedy Township                         1
Huntingdon                               1
Bratenahl                                1
Napierville                              1
Plum Boro                                1
De Winton                                1
St. Léonard                             1
Mississuaga                              1
Buena Vista                              1
Rawdon                                   1
Black Earth                              1
North Strabane Township                  1
L'Assomption                             1
Ste-Rose                                 1
springdale                               1
Charlemagne                              1
Les Cèdres                              1
Agincourt                                1
ÃŽle-des-Soeurs                          1
Mercier                                  1
Chatauguay                               1
Sun Praiie                               1
LaGrange                                 1
Coteau-du-Lac                            1
Deerfield                                1
Cuyahoga Heights                         1
Paw Creek                                1
(671,)
In [60]:
# Plot the city distribution excluding the dominant Las Vegas, normalized to percentages
data = train_test_set.loc[train_test_set['city']!="Las Vegas", 'city']
weights = _np.ones(len(data)) / len(data)
_plt.figure(figsize=(20,10))
_plt.hist(data, weights=weights, bins=100)
_plt.title("City distribution")
_plt.gca().yaxis.set_major_formatter(_PercentFormatter(1))
_plt.show()
In [61]:
# Keep only cities with at least 100 reviews; their names will form a regex alternation
main_cities = city_count.where(city_count >= 100).dropna()
print(main_cities.to_string())
print(main_cities.shape)
main_cities = '|'.join(list(main_cities.index))
Las Vegas               208437.0
Phoenix                  71126.0
Toronto                  57047.0
Charlotte                40190.0
Scottsdale               36579.0
Pittsburgh               26891.0
Henderson                21518.0
Montréal                18531.0
Tempe                    18354.0
Mesa                     17235.0
Chandler                 15441.0
Gilbert                  14055.0
Cleveland                12873.0
Glendale                 10742.0
Madison                   9688.0
Markham                   8248.0
Calgary                   7958.0
Mississauga               7192.0
Peoria                    6582.0
North Las Vegas           4771.0
Surprise                  4117.0
Richmond Hill             3886.0
Goodyear                  3020.0
Concord                   2905.0
Scarborough               2544.0
Champaign                 2513.0
Vaughan                   2496.0
Avondale                  2451.0
Huntersville              2046.0
Lakewood                  2008.0
Matthews                  1986.0
North York                1977.0
Fort Mill                 1851.0
Brampton                  1643.0
Cave Creek                1538.0
Gastonia                  1282.0
Cornelius                 1278.0
Etobicoke                 1229.0
Akron                     1188.0
Westlake                  1118.0
Boulder City              1059.0
Thornhill                  985.0
Strongsville               957.0
Middleton                  930.0
Mentor                     864.0
Fountain Hills             836.0
Cuyahoga Falls             828.0
Rock Hill                  820.0
Cleveland Heights          813.0
Parma                      789.0
Kent                       745.0
Litchfield Park            745.0
Aurora                     712.0
North Olmsted              693.0
Beachwood                  654.0
Independence               648.0
Pineville                  647.0
Medina                     647.0
Newmarket                  637.0
Belmont                    631.0
Monroeville                610.0
Willoughby                 607.0
Rocky River                579.0
Solon                      576.0
Oakville                   573.0
Indian Trail               545.0
Avon                       543.0
Bridgeville                512.0
Davidson                   507.0
Hudson                     500.0
Pickering                  493.0
Laval                      491.0
Urbana                     491.0
Ajax                       479.0
Fitchburg                  464.0
Waxhaw                     462.0
Montreal                   441.0
Sun Prairie                434.0
Homestead                  429.0
Paradise Valley            425.0
Chagrin Falls              410.0
Stow                       406.0
Canonsburg                 400.0
Woodbridge                 374.0
Wexford                    368.0
Whitby                     367.0
Woodmere                   360.0
Carnegie                   358.0
Lyndhurst                  358.0
Coraopolis                 341.0
Fairlawn                   341.0
Bethel Park                331.0
Unionville                 326.0
Elyria                     319.0
Verona                     315.0
Berea                      313.0
Spring Valley              308.0
Sewickley                  306.0
Orange Village             297.0
Carefree                   294.0
Denver                     288.0
Brossard                   284.0
Brunswick                  282.0
Sun City                   280.0
Indian Land                267.0
Broadview Heights          259.0
Fairview Park              254.0
Monona                     253.0
Laveen                     253.0
Harrisburg                 252.0
Avon Lake                  249.0
Tolleson                   243.0
Twinsburg                  243.0
Mayfield Heights           242.0
Buckeye                    242.0
Middleburg Heights         239.0
Macedonia                  238.0
Dollard-des-Ormeaux        238.0
Brooklyn                   233.0
Oakmont                    224.0
Mint Hill                  224.0
Painesville                223.0
North Royalton             210.0
Kannapolis                 204.0
Amherst                    203.0
Queen Creek                203.0
Streetsboro                196.0
Monroe                     194.0
Clover                     193.0
El Mirage                  192.0
Orange                     188.0
Murrysville                188.0
Mount Holly                185.0
Verdun                     170.0
Dorval                     168.0
Maple                      167.0
Lake Wylie                 167.0
Chardon                    166.0
University Heights         165.0
Valley View                153.0
Highland Heights           153.0
South Las Vegas            153.0
Airdrie                    150.0
York                       149.0
Tega Cay                   148.0
Lorain                     147.0
West Mifflin               146.0
McMurray                   144.0
Moon Township              144.0
Chesterland                141.0
Euclid                     141.0
Allison Park               140.0
Irwin                      139.0
Brecksville                138.0
Westmount                  138.0
Waunakee                   137.0
Brook Park                 137.0
Stoughton                  134.0
Pointe-Claire              130.0
Northfield                 127.0
Warrensville Heights       127.0
Saint-Laurent              126.0
South Euclid               126.0
East York                  124.0
McKees Rocks               120.0
Bellevue                   117.0
Olmsted Falls              116.0
Braddock                   116.0
Blue Diamond               109.0
Ahwatukee                  109.0
New Kensington             108.0
Seven Hills                107.0
Mooresville                106.0
Upper Saint Clair          105.0
Longueuil                  105.0
North Ridgeville           102.0
(176,)
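Note that `main_cities` is used below as a regex alternation passed to `str.findall`, so any city name containing a regex metacharacter (such as a dot) would be misinterpreted. None of the 176 names above appear to, but escaping each name with `re.escape` before joining is cheap insurance, and sorting longer names first avoids a shorter name shadowing a longer one that starts at the same position. A minimal sketch with hypothetical names, using plain `re`/`pandas` rather than the notebook's aliases:

```python
import re
import pandas as pd

# Hypothetical counts; "Rocky View No. 44" contains a regex metacharacter (the dot)
counts = pd.Series({"Las Vegas": 200, "North Las Vegas": 120, "Rocky View No. 44": 110})

# Escape each name and try longer names first, so e.g. "North Las Vegas"
# cannot be shadowed by the shorter "Las Vegas"
names = sorted(counts.index, key=len, reverse=True)
pattern = '|'.join(map(re.escape, names))

cities = pd.Series(["North Las Vegas", "Rocky View No. 44", "Elsewhere"])
matched = cities.str.findall(pattern).map(lambda x: 'Other' if x == [] else x[0])
print(matched.tolist())  # ['North Las Vegas', 'Rocky View No. 44', 'Other']
```

Without the escaping, the dot in "Rocky View No. 44" would match any character, so unrelated strings could be mapped to that city.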
In [62]:
# Map each city to its first match among the main cities; unmatched cities become 'Other'
train_test_set['city'] = train_test_set['city'].str.findall(main_cities)
train_test_set['city'] = train_test_set['city'].map(lambda x: 'Other' if x == [] else x[0])
train_test_set.head()
Out[62]:
review_id user_id business_id stars_review useful_review funny_review cool_review bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes postal_code latitude longitude review_count stars_restaurant OutdoorSeating BusinessAcceptsCreditCards RestaurantsDelivery RestaurantsReservations WiFi Alcohol categories city Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count average_stars_user review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real
0 hhVr1uH7XaRlbIHI8dYvbA FYhU1fKQ7n11WQ7gcFYOag --1UhMGODdWsrMastO9DZw 5 0 0 0 1 0.622302 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 T2P 0K5 51.049673 -114.079977 24 4.0 True None False None None Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 0.009869 3.846154 18 0 1 8 4 4 4 13.0 4.222222 9.0 3.798692 11.168551 3.556721 5.00000 3.686094 3.777956 3.857143 3.684951 3.000000 5.000000 3.868171 2.000000 3.542997 5.000000 3.662669 3.749776 5.000000 3.66752 3.000000 5.000000 3.851784 2.000000 3.555679 5.000000 3.678871 3.770170 3.926822 3.676024 2.866472 5.000000 3.867280 2.000000
1 no4Eo4WloZRLwcYZP9gfhg m-p-7WuB85UjsLDaxJXCXA --1UhMGODdWsrMastO9DZw 5 0 0 0 1 0.964784 4.000000 4.000000 4.000000 3.757630 3.808050 3.714375 1 T2P 0K5 51.049673 -114.079977 24 4.0 True None False None None Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 0.009869 3.333333 18 0 0 1 1 1 454 3.0 3.333333 3.0 3.231620 2.702970 3.556721 3.79608 4.000000 3.777956 5.000000 2.500000 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 4.000000 3.749776 5.000000 2.50000 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 4.000000 3.770170 5.000000 2.353400 3.788204 3.928912 3.867280 3.767263
2 8QUwzeXeyJ3L15lKfhKLsQ Bsy9F-59sl9OT_bvZNl3hA --1UhMGODdWsrMastO9DZw 1 0 0 0 1 0.871544 3.000000 3.000000 2.990709 2.926486 2.974252 2.874015 0 T2P 0K5 51.049673 -114.079977 24 4.0 True None False None None Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 0.009869 2.571429 27 0 0 31 3 3 46 14.0 2.692308 13.0 2.692835 12.704916 2.500000 1.00000 3.000000 3.777956 3.200000 3.684951 3.789846 1.000000 2.333333 3.770015 2.500000 1.000000 3.000000 3.749776 3.200000 3.66752 3.771654 1.000000 3.000000 3.744434 2.505665 1.000000 2.990709 3.770170 3.204491 3.676024 3.788204 1.000000 2.974502 3.767263
3 BIecLw546kAlD7kmlk7vXA sTVGcezME7gYBhIlYtcfpg --1UhMGODdWsrMastO9DZw 2 0 0 0 1 0.988395 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 T2P 0K5 51.049673 -114.079977 24 4.0 True None False None None Beer&Wine Restaurants, Mexican Calgary 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 NaT 20:00:00 20:00:00 20:00:00 20:00:00 20:00:00 04:00:00 NaT 4.227273 22.0 4.214286 14.0 4.267477 16.978214 0.009869 3.703313 2 0 0 1 0 0 622 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 L4B 3P7 43.841694 -79.399755 44 3.0 False True False True No Beer&Wine Chinese, Restaurants Richmond Hill 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 11:00:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 22:30:00 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.000000 2.980769 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.00000 3.686094 2.000000 3.040000 2.600000 3.600000 2.500000 1.000000 3.750000 2.923077 4.000000 3.662669 1.666667 3.146341 2.60000 3.750000 2.500000 1.000000 3.666667 2.954478 4.000000 3.678871 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739

We print and plot the distribution of the categories to see the long tail and decide how many of them to keep.
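One way to turn "decide how many of them to keep" into a number is cumulative coverage: keep the most frequent categories until they account for some target share (say 95%) of all category labels. A minimal sketch on a hypothetical `value_counts` Series (the threshold and counts here are illustrative, not taken from the dataset):

```python
import pandas as pd

# Hypothetical category counts, already sorted descending as value_counts returns them
category_count = pd.Series({"Restaurants": 700, "Food": 190, "Bars": 160,
                            "Pizza": 60, "Ramen": 10, "Fondue": 5})

# Cumulative share of all labels covered by the top-k categories
coverage = category_count.cumsum() / category_count.sum()

# Smallest prefix of categories covering at least 95% of the labels
n_keep = int((coverage < 0.95).sum()) + 1
kept = category_count.index[:n_keep]
print(n_keep, list(kept))  # → 4 ['Restaurants', 'Food', 'Bars', 'Pizza']
```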

In [63]:
# Split each comma-separated category string and count every individual category
category_count = _pd.Series(', '.join(list(train_test_set['categories'])).split(', ')).value_counts()
print(category_count.to_string())
print(category_count.shape)
Restaurants                         712379
Food                                192841
Nightlife                           170259
Bars                                165959
American (Traditional)              123975
Breakfast & Brunch                  120310
American (New)                      117437
Sandwiches                           82264
Mexican                              73726
Burgers                              71598
Pizza                                67538
Italian                              62490
Seafood                              57228
Japanese                             53952
Salad                                49388
Fast Food                            46982
Event Planning & Services            45382
Coffee & Tea                         43592
Chinese                              42971
Asian Fusion                         42576
Sushi Bars                           42487
Cafes                                40002
Steakhouses                          37097
Desserts                             36523
Cocktail Bars                        36152
Barbeque                             33004
Sports Bars                          26918
Vegetarian                           25616
Chicken Wings                        24833
Vegan                                23768
Wine Bars                            23685
Caterers                             23328
Mediterranean                        22625
Specialty Food                       21980
Thai                                 21702
Bakeries                             21321
Wine & Spirits                       21204
Beer                                 21204
Pubs                                 21076
Arts & Entertainment                 20724
Diners                               20452
Gastropubs                           17953
Korean                               17831
Gluten-Free                          17217
Buffets                              17133
Juice Bars & Smoothies               16826
Soup                                 16070
Noodles                              15555
Lounges                              15291
Delis                                14963
Vietnamese                           14470
Southern                             14139
Comfort Food                         13805
Tapas/Small Plates                   13335
French                               12767
Beer Bar                             12567
Venues & Event Spaces                12524
Canadian (New)                       12509
Indian                               12502
Tacos                                12369
Hotels & Travel                      11840
Middle Eastern                       11447
Ramen                                11015
Greek                                10833
Tex-Mex                              10615
Latin American                       10474
Hotels                               10261
Hawaiian                              9674
Ice Cream & Frozen Yogurt             9485
Breweries                             9293
Casinos                               9055
Tapas Bars                            8057
Ethnic Food                           7862
Chicken Shop                          6779
Hot Dogs                              6534
Cajun/Creole                          6237
Food Delivery Services                6237
Halal                                 5951
Poke                                  5903
Music Venues                          5663
Local Flavor                          5617
Soul Food                             5577
Shopping                              5421
Caribbean                             5311
Food Trucks                           5295
Bagels                                5287
Bubble Tea                            5119
Dim Sum                               5023
Grocery                               4705
Active Life                           4482
New Mexican Cuisine                   4385
Taiwanese                             4155
Creperies                             4008
Fish & Chips                          3903
Waffles                               3781
Resorts                               3552
Street Vendors                        3490
Pakistani                             3441
Spanish                               3393
Donuts                                3392
Tea Rooms                             3247
Wraps                                 3114
British                               3019
Dive Bars                             2989
Filipino                              2898
Cantonese                             2686
Dance Clubs                           2642
Karaoke                               2635
Brazilian                             2614
Cheesesteaks                          2594
Hot Pot                               2589
Irish                                 2566
Beauty & Spas                         2503
Modern European                       2479
Beer Gardens                          2269
Food Stands                           2191
Lebanese                              2170
Live/Raw Food                         2108
Smokehouse                            2062
Szechuan                              1985
Meat Shops                            1940
Arcades                               1925
African                               1904
Party & Event Planning                1852
German                                1814
Patisserie/Cake Shop                  1805
Health Markets                        1785
Poutineries                           1727
Day Spas                              1724
Imported Food                         1700
Izakaya                               1668
Brewpubs                              1638
Coffee Roasteries                     1622
Teppanyaki                            1577
Brasseries                            1461
Irish Pub                             1460
Peruvian                              1452
Pan Asian                             1441
Food Court                            1419
Internet Cafes                        1415
Persian/Iranian                       1402
Pasta Shops                           1357
Turkish                               1330
Acai Bowls                            1317
Kebab                                 1315
Falafel                               1275
Portuguese                            1260
Chocolatiers & Shops                  1258
Cuban                                 1257
Butcher                               1251
Bistros                               1237
Golf                                  1212
Seafood Markets                       1194
Whiskey Bars                          1173
Shopping Centers                      1146
Farmers Market                        1141
Automotive                            1130
Do-It-Yourself Food                   1122
Mongolian                             1073
Amusement Parks                       1066
Bowling                               1056
Conveyor Belt Sushi                   1053
Hookah Bars                           1051
Cupcakes                              1018
Fondue                                1005
Gelato                                1003
Polish                                1003
Singaporean                            965
Cinema                                 964
Shaved Ice                             951
Custom Cakes                           938
Malaysian                              926
Cheese Shops                           904
Afghan                                 900
Jazz & Blues                           884
Argentine                              870
Laotian                                848
Fruits & Veggies                       843
Performing Arts                        812
Kosher                                 808
Flowers & Gifts                        798
Organic Stores                         789
Colombian                              751
Venezuelan                             739
Convenience Stores                     735
Wedding Planning                       721
Arabian                                706
Japanese Curry                         705
Himalayan/Nepalese                     675
Belgian                                654
Ethiopian                              644
Fashion                                642
Car Wash                               641
Pool Halls                             588
Kids Activities                        584
Local Services                         582
Home & Garden                          569
Tuscan                                 551
Home Services                          542
Candy Stores                           522
Gas Stations                           501
Eatertainment                          489
Delicatessen                           475
Professional Services                  471
Russian                                468
Adult Entertainment                    467
Salvadoran                             467
Gift Shops                             459
Hair Salons                            453
International Grocery                  451
Health & Medical                       429
Moroccan                               426
Social Clubs                           407
Cafeteria                              381
Piano Bars                             376
Pretzels                               364
Hungarian                              360
Playgrounds                            352
Bed & Breakfast                        351
Ethnic Grocery                         349
Public Markets                         345
Speakeasies                            344
Beverage Store                         341
Art Galleries                          333
Hakka                                  326
Egyptian                               324
Pets                                   323
Puerto Rican                           310
Museums                                307
Ukrainian                              306
Macarons                               304
Education                              303
Wineries                               301
Empanadas                              300
Beer Garden                            291
Dinner Theater                         289
International                          288
Shaved Snow                            284
Nail Salons                            283
Arts & Crafts                          280
Indonesian                             265
Airports                               265
Personal Chefs                         263
Country Dance Halls                    263
Outlet Stores                          262
Musicians                              259
Public Services & Government           249
Hong Kong Style Cafe                   247
Real Estate                            245
Hostels                                245
Themed Cafes                           241
Syrian                                 236
Donairs                                236
Employment Agencies                    232
Furniture Stores                       231
Swimming Pools                         229
Massage                                228
Cambodian                              213
Department Stores                      213
Tours                                  211
Distilleries                           198
Dominican                              198
Mini Golf                              195
Zoos                                   195
Hair Removal                           191
Scandinavian                           183
Armenian                               179
Florists                               178
Shanghainese                           177
Festivals                              173
Waxing                                 172
Drugstores                             171
Kombucha                               171
Cooking Classes                        170
Tabletop Games                         170
Eyelash Service                        169
Personal Shopping                      168
Bangladeshi                            165
Nail Technicians                       165
Botanical Gardens                      165
Gay Bars                               163
Appliances                             160
Fitness & Instruction                  158
Towing                                 157
Barbers                                156
Wholesale Stores                       154
Sri Lankan                             154
Home Decor                             152
Landmarks & Historical Buildings       151
Nutritionists                          148
Kitchen & Bath                         146
Community Service/Non-Profit           145
Basque                                 137
Tiki Bars                              137
South African                          127
Wedding Chapels                        119
Books                                  117
Music & Video                          117
Mags                                   117
Antiques                               112
Sporting Goods                         110
Animal Shelters                        110
Skin Care                              110
Stadiums & Arenas                      109
Pop-Up Restaurants                     106
Guamanian                              106
Food Tours                             105
Tobacco Shops                          100
Cosmetics & Beauty Supply               99
Wine Tasting Room                       98
Laser Tag                               97
Herbs & Spices                          97
Printing Services                       94
Ethical Grocery                         93
Screen Printing                         93
Specialty Schools                       93
Walking Tours                           92
Recreation Centers                      91
Cabaret                                 89
Burmese                                 89
Cideries                                89
Udon                                    88
Coffee & Tea Supplies                   78
Gyms                                    78
Champagne Bars                          77
Street Art                              76
Austrian                                74
Baby Gear & Furniture                   74
Wigs                                    74
Health Retreats                         73
Australian                              73
Building Supplies                       72
Supper Clubs                            72
Doctors                                 72
Couriers & Delivery Services            67
Czech                                   67
Tasting Classes                         67
Auto Glass Services                     66
Country Clubs                           65
Yoga                                    65
Sports Wear                             64
Religious Organizations                 64
Horseback Riding                        61
Mass Media                              61
Shared Office Spaces                    61
Team Building Activities                60
Hobby Shops                             60
Shoe Stores                             60
Virtual Reality Centers                 60
Cooking Schools                         59
Veterinarians                           59
Pet Services                            59
Art Classes                             59
Haitian                                 59
Contractors                             56
Wholesalers                             56
Bartenders                              54
Outdoor Furniture Stores                53
Appliances & Repair                     52
Hospitals                               52
Accessories                             51
Parks                                   50
Uzbek                                   49
Financial Services                      49
Comedy Clubs                            48
Vehicle Wraps                           47
Farms                                   47
Honduran                                47
Art Museums                             47
Car Window Tinting                      47
Cultural Center                         47
Vacation Rentals                        47
Coffeeshops                             46
Women's Clothing                        45
Cigar Bars                              45
DJs                                     45
Vitamins & Supplements                  44
Dentists                                44
Trinidadian                             44
Trainers                                43
Bookstores                              43
Iberian                                 42
Emergency Medicine                      40
Skating Rinks                           40
Cardiologists                           40
Jewelry                                 40
Pediatricians                           40
Strip Clubs                             39
Auto Repair                             39
Sports Clubs                            39
Party Equipment Rentals                 39
Leisure Centers                         39
Indoor Playcentre                       39
Mattresses                              38
Hotel bar                               37
Colleges & Universities                 37
Nicaraguan                              37
Home Window Tinting                     37
Transportation                          37
Weight Loss Centers                     36
Party Supplies                          36
Pharmacy                                36
Bar Crawl                               36
Used                                    36
Vintage & Consignment                   36
Airport Terminals                       36
Men's Clothing                          34
Vinyl Records                           34
Cheese Tasting Classes                  34
Limos                                   34
Wine Tasting Classes                    33
Escape Games                            32
Rock Climbing                           32
Physical Therapy                        32
Dry Cleaning & Laundry                  32
Adult Education                         31
Cannabis Clinics                        31
RV Parks                                31
Real Estate Services                    31
Toy Stores                              30
General Dentistry                       30
Rehabilitation Center                   30
Sicilian                                30
CSA                                     29
Home Cleaning                           29
Pet Adoption                            28
Photography Stores & Services           28
Sugar Shacks                            28
Trampoline Parks                        28
Climbing                                28
Cards & Stationery                      27
Amateur Sports Teams                    25
Plumbing                                25
Popcorn Shops                           25
Cannabis Dispensaries                   25
Discount Store                          25
Hair Stylists                           25
Grilling Equipment                      24
Currency Exchange                       24
Cannabis Collective                     24
Check Cashing/Pay-day Loans             24
Fireplace Services                      23
Art Schools                             22
Opera & Ballet                          22
Security Systems                        22
Leather Goods                           22
Christmas Trees                         21
Car Rental                              21
Travel Services                         21
Post Offices                            20
Optometrists                            20
Auto Parts & Supplies                   20
Pet Stores                              20
Transmission Repair                     20
Bikes                                   20
Rotisserie Chicken                      19
Electronics                             19
Libraries                               19
Oil Change Stations                     19
Nurseries & Gardening                   19
Body Shops                              19
Auto Insurance                          19
Windshield Installation & Repair        19
Insurance                               19
Horse Racing                            19
Massage Therapy                         18
Videos & Video Game Rental              18
Family Practice                         17
Golf Lessons                            17
Makeup Artists                          17
Airport Shuttles                        16
Office Equipment                        16
Pool & Billiards                        16
Tennis                                  16
Bounce House Rentals                    16
Laundry Services                        16
Party Bus Rentals                       16
Eyebrow Services                        15
Dry Cleaning                            15
Security Services                       15
Race Tracks                             15
Personal Assistants                     15
Club Crawl                              15
Skydiving                               14
Sailing                                 14
Archery                                 14
Airport Lounges                         14
Laser Hair Removal                      14
Olive Oil                               14
Tanning                                 14
Video Game Stores                       13
Scottish                                13
Tanning Beds                            12
Pita                                    12
Interior Design                         12
Used Bookstore                          12
Spray Tanning                           12
Bingo Halls                             11
Yelp Events                             11
Reflexology                             11
Movers                                  11
Slovakian                               11
Boat Charters                           11
Town Car Service                        11
Tires                                   11
Reunion                                 11
Septic Services                         11
Lakes                                   11
Funeral Services & Cemeteries           11
Auto Customization                      11
Drive-Thru Bars                         11
Boating                                 11
Game Meat                               11
Ice Delivery                            11
Life Coach                              11
Summer Camps                            10
Soba                                    10
Golf Equipment                          10
Eyewear & Opticians                     10
Supernatural Readings                   10
Signmaking                              10
Psychics                                10
Mobile Phones                           10
Photographers                           10
Paint & Sip                             10
Ophthalmologists                        10
Hair Extensions                          9
Golf Cart Dealers                        9
Historical Tours                         9
Car Dealers                              9
Meditation Centers                       8
Oaxacan                                  8
Pick Your Own Farms                      8
Business Consulting                      8
Masonry/Concrete                         8
Reiki                                    8
Comic Books                              8
Psychologists                            8
Commercial Truck Repair                  8
Pumpkin Patches                          8
Bike Rentals                             8
Electricians                             8
Photo Booth Rentals                      8
Truck Rental                             8
Counseling & Mental Health               8
Graphic Design                           7
Mortgage Brokers                         7
Recording & Rehearsal Studios            7
Bus Tours                                7
Parenting Classes                        7
Guest Houses                             7
Tonkatsu                                 7
Preschools                               7
Bike Repair/Maintenance                  7
Head Shops                               7
Restaurant Supplies                      7
Ticket Sales                             7
Tempura                                  7
Dance Schools                            7
Real Estate Agents                       7
Audio/Visual Equipment Rental            6
Art Supplies                             6
Furniture Repair                         6
Pilates                                  6
Trophy Shops                             6
Souvenir Shops                           6
Piercing                                 6
Traditional Clothing                     6
Lighting Fixtures & Equipment            6
Bridal                                   6
Tattoo                                   6
Pool & Hot Tub Service                   6
Fur Clothing                             6
Beach Bars                               6
Chiropractors                            6
Martial Arts                             6
Furniture Rental                         6
Soccer                                   6
Carpet Installation                      5
Handyman                                 5
Medical Spas                             5
Pressure Washers                         5
Gutter Services                          5
Beaches                                  5
Acupuncture                              5
Hot Tub & Pool                           5
Tax Services                             5
Window Washing                           5
Windows Installation                     5
Heating & Air Conditioning/HVAC          5
Golf Equipment Shops                     5
Dog Walkers                              5
Pet Sitting                              5
Kids Hair Salons                         5
Property Management                      4
Door Sales/Installation                  4
Gun/Rifle Ranges                         4
Wine Tours                               4
Bankruptcy Law                           4
Bocce Ball                               4
Honey                                    4
Glass & Mirrors                          4
Lawyers                                  4
Hats                                     4
Knife Sharpening                         4
Software Development                     4
Tickets                                  4
Blow Dry/Out Services                    4
Keys & Locksmiths                        4
Water Stores                             4
Holistic Animal Care                     3
Calabrian                                3
Animal Assisted Therapy                  3
Animal Physical Therapy                  3
Swiss Food                               3
Aquarium Services                        3
Local Fish Stores                        3
Diagnostic Services                      3
Brazilian Jiu-jitsu                      3
Mauritius                                3
Laboratory Testing                       3
Thrift Stores                            3
Cosmetic Dentists                        3
Surf Schools                             3
Tai Chi                                  3
Vape Shops                               3
Private Tutors                           3
Observatories                            3
Ski Resorts                              3
Web Design                               3
Aquariums                                3
Studio Taping                            3
Churches                                 3
Advertising                              3
Food Banks                               3
Hardware Stores                          3
Day Camps                                3
Apartments                               3
Occupational Therapy                     3
Airsoft                                  3
Pet Training                             3
Swimwear                                 3
Hainan                                   3
Accountants                              3
Unofficial Yelp Events                   3
Police Departments                       3
Pop-up Shops                             3
Air Duct Cleaning                        3
Paint-Your-Own Pottery                   3
Aircraft Repairs                         3
Flea Markets                             2
Pub Food                                 2
Divorce & Family Law                     2
Engraving                                2
Naturopathic/Holistic                    2
Marinas                                  2
Plus Size Fashion                        2
Immigration Law                          2
Computers                                2
Wills                                    2
Magicians                                2
Medical Cannabis Referrals               2
Car Share Services                       2
Bespoke Clothing                         2
Public Transportation                    2
Hunting & Fishing Supplies               2
Boat Dealers                             2
Medical Centers                          2
Trusts                                   2
Tax Law                                  2
Community Centers                        2
Boat Repair                              2
Outdoor Gear                             2
Pet Boarding                             2
Formal Wear                              2
Marketing                                2
Estate Planning Law                      2
Painters                                 2
Courthouses                              2
& Probates                               2
Service Stations                         2
Personal Injury Law                      2
Drywall Installation & Repair            2
Clowns                                   2
Batting Cages                            1
Beer Hall                                1
Holiday Decorations                      1
Furniture Reupholstery                   1
Churros                                  1
Clothing Rental                          1
University Housing                       1
Pet Groomers                             1
Auto Detailing                           1
Pawn Shops                               1
Senegalese                               1
Gardeners                                1
Campgrounds                              1
Banks & Credit Unions                    1
Laundromat                               1
RV Repair                                1
Roofing                                  1
Officiants                               1
Music & DVDs                             1
Home Health Care                         1
Squash                                   1
Auto Upholstery                          1
Siding                                   1
Landscaping                              1
Motorcycle Repair                        1
Ski Schools                              1
Flooring                                 1
Taxis                                    1
Foundation Repair                        1
Minho                                    1
(714,)
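The counts above come from splitting each business's comma-separated category string and tallying the individual labels. A minimal sketch of that aggregation, using a hypothetical three-row table (the actual preprocessing cell is defined earlier in the notebook):

```python
import pandas as pd

# Hypothetical business table with comma-separated category strings
biz = pd.DataFrame({"categories": [
    "Restaurants, Pizza, Italian",
    "Restaurants, Pizza",
    "Food, Coffee & Tea",
]})

# Split each string into individual categories, flatten, and count occurrences
category_count = (
    biz["categories"]
    .str.split(", ")
    .explode()
    .value_counts()
)
print(category_count["Restaurants"], category_count["Coffee & Tea"])  # 2 1
```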
In [64]:
# Drop the generic 'Restaurants' and 'Food' labels, which apply to almost every business
data = category_count.drop(labels=['Restaurants', 'Food']).index
vals = category_count.drop(labels=['Restaurants', 'Food']).values
# Normalize so the y axis shows each bin's share of the total rather than raw counts
weights = vals / vals.sum()
_plt.figure(figsize=(20, 10))
_plt.hist(data, weights=weights, bins=100)
_plt.title("Category distribution")
_plt.gca().yaxis.set_major_formatter(_PercentFormatter(1))
_plt.show()
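The next cell joins the escaped category names into a single regex alternation pattern. A minimal sketch of how such a pattern can then be used to match businesses against the main categories, with hypothetical category strings (escaping matters because names like "American (Traditional)" contain regex metacharacters):

```python
import re
import pandas as pd

# Hypothetical subset of the main categories (for illustration only)
main = ["Nightlife", "American (Traditional)", "Bars"]
# Escape metacharacters such as the parentheses before joining with '|'
pattern = "|".join(re.escape(c) for c in main)

cats = pd.Series([
    "Nightlife, Bars",
    "American (Traditional), Burgers",
    "Bookstores",
])
# True where the category string contains any of the main categories
mask = cats.str.contains(pattern)
print(mask.tolist())  # [True, True, False]
```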
In [65]:
# Keep only categories with at least 100 occurrences; .where() introduces NaNs
# (dropped just after), which is why the counts below are displayed as floats
main_categories = category_count.drop(labels=['Restaurants', 'Food']).where(category_count >= 100).dropna()
print(main_categories.to_string())
print(main_categories.shape)
# Join the escaped category names into a single regex alternation pattern
main_categories = '|'.join([_re.escape(x) for x in main_categories.index])
Nightlife                           170259.0
Bars                                165959.0
American (Traditional)              123975.0
Breakfast & Brunch                  120310.0
American (New)                      117437.0
Sandwiches                           82264.0
Mexican                              73726.0
Burgers                              71598.0
Pizza                                67538.0
Italian                              62490.0
Seafood                              57228.0
Japanese                             53952.0
Salad                                49388.0
Fast Food                            46982.0
Event Planning & Services            45382.0
Coffee & Tea                         43592.0
Chinese                              42971.0
Asian Fusion                         42576.0
Sushi Bars                           42487.0
Cafes                                40002.0
Steakhouses                          37097.0
Desserts                             36523.0
Cocktail Bars                        36152.0
Barbeque                             33004.0
Sports Bars                          26918.0
Vegetarian                           25616.0
Chicken Wings                        24833.0
Vegan                                23768.0
Wine Bars                            23685.0
Caterers                             23328.0
Mediterranean                        22625.0
Specialty Food                       21980.0
Thai                                 21702.0
Bakeries                             21321.0
Wine & Spirits                       21204.0
Beer                                 21204.0
Pubs                                 21076.0
Arts & Entertainment                 20724.0
Diners                               20452.0
Gastropubs                           17953.0
Korean                               17831.0
Gluten-Free                          17217.0
Buffets                              17133.0
Juice Bars & Smoothies               16826.0
Soup                                 16070.0
Noodles                              15555.0
Lounges                              15291.0
Delis                                14963.0
Vietnamese                           14470.0
Southern                             14139.0
Comfort Food                         13805.0
Tapas/Small Plates                   13335.0
French                               12767.0
Beer Bar                             12567.0
Venues & Event Spaces                12524.0
Canadian (New)                       12509.0
Indian                               12502.0
Tacos                                12369.0
Hotels & Travel                      11840.0
Middle Eastern                       11447.0
Ramen                                11015.0
Greek                                10833.0
Tex-Mex                              10615.0
Latin American                       10474.0
Hotels                               10261.0
Hawaiian                              9674.0
Ice Cream & Frozen Yogurt             9485.0
Breweries                             9293.0
Casinos                               9055.0
Tapas Bars                            8057.0
Ethnic Food                           7862.0
Chicken Shop                          6779.0
Hot Dogs                              6534.0
Cajun/Creole                          6237.0
Food Delivery Services                6237.0
Halal                                 5951.0
Poke                                  5903.0
Music Venues                          5663.0
Local Flavor                          5617.0
Soul Food                             5577.0
Shopping                              5421.0
Caribbean                             5311.0
Food Trucks                           5295.0
Bagels                                5287.0
Bubble Tea                            5119.0
Dim Sum                               5023.0
Grocery                               4705.0
Active Life                           4482.0
New Mexican Cuisine                   4385.0
Taiwanese                             4155.0
Creperies                             4008.0
Fish & Chips                          3903.0
Waffles                               3781.0
Resorts                               3552.0
Street Vendors                        3490.0
Pakistani                             3441.0
Spanish                               3393.0
Donuts                                3392.0
Tea Rooms                             3247.0
Wraps                                 3114.0
British                               3019.0
Dive Bars                             2989.0
Filipino                              2898.0
Cantonese                             2686.0
Dance Clubs                           2642.0
Karaoke                               2635.0
Brazilian                             2614.0
Cheesesteaks                          2594.0
Hot Pot                               2589.0
Irish                                 2566.0
Beauty & Spas                         2503.0
Modern European                       2479.0
Beer Gardens                          2269.0
Food Stands                           2191.0
Lebanese                              2170.0
Live/Raw Food                         2108.0
Smokehouse                            2062.0
Szechuan                              1985.0
Meat Shops                            1940.0
Arcades                               1925.0
African                               1904.0
Party & Event Planning                1852.0
German                                1814.0
Patisserie/Cake Shop                  1805.0
Health Markets                        1785.0
Poutineries                           1727.0
Day Spas                              1724.0
Imported Food                         1700.0
Izakaya                               1668.0
Brewpubs                              1638.0
Coffee Roasteries                     1622.0
Teppanyaki                            1577.0
Brasseries                            1461.0
Irish Pub                             1460.0
Peruvian                              1452.0
Pan Asian                             1441.0
Food Court                            1419.0
Internet Cafes                        1415.0
Persian/Iranian                       1402.0
Pasta Shops                           1357.0
Turkish                               1330.0
Acai Bowls                            1317.0
Kebab                                 1315.0
Falafel                               1275.0
Portuguese                            1260.0
Chocolatiers & Shops                  1258.0
Cuban                                 1257.0
Butcher                               1251.0
Bistros                               1237.0
Golf                                  1212.0
Seafood Markets                       1194.0
Whiskey Bars                          1173.0
Shopping Centers                      1146.0
Farmers Market                        1141.0
Automotive                            1130.0
Do-It-Yourself Food                   1122.0
Mongolian                             1073.0
Amusement Parks                       1066.0
Bowling                               1056.0
Conveyor Belt Sushi                   1053.0
Hookah Bars                           1051.0
Cupcakes                              1018.0
Fondue                                1005.0
Gelato                                1003.0
Polish                                1003.0
Singaporean                            965.0
Cinema                                 964.0
Shaved Ice                             951.0
Custom Cakes                           938.0
Malaysian                              926.0
Cheese Shops                           904.0
Afghan                                 900.0
Jazz & Blues                           884.0
Argentine                              870.0
Laotian                                848.0
Fruits & Veggies                       843.0
Performing Arts                        812.0
Kosher                                 808.0
Flowers & Gifts                        798.0
Organic Stores                         789.0
Colombian                              751.0
Venezuelan                             739.0
Convenience Stores                     735.0
Wedding Planning                       721.0
Arabian                                706.0
Japanese Curry                         705.0
Himalayan/Nepalese                     675.0
Belgian                                654.0
Ethiopian                              644.0
Fashion                                642.0
Car Wash                               641.0
Pool Halls                             588.0
Kids Activities                        584.0
Local Services                         582.0
Home & Garden                          569.0
Tuscan                                 551.0
Home Services                          542.0
Candy Stores                           522.0
Gas Stations                           501.0
Eatertainment                          489.0
Delicatessen                           475.0
Professional Services                  471.0
Russian                                468.0
Adult Entertainment                    467.0
Salvadoran                             467.0
Gift Shops                             459.0
Hair Salons                            453.0
International Grocery                  451.0
Health & Medical                       429.0
Moroccan                               426.0
Social Clubs                           407.0
Cafeteria                              381.0
Piano Bars                             376.0
Pretzels                               364.0
Hungarian                              360.0
Playgrounds                            352.0
Bed & Breakfast                        351.0
Ethnic Grocery                         349.0
Public Markets                         345.0
Speakeasies                            344.0
Beverage Store                         341.0
Art Galleries                          333.0
Hakka                                  326.0
Egyptian                               324.0
Pets                                   323.0
Puerto Rican                           310.0
Museums                                307.0
Ukrainian                              306.0
Macarons                               304.0
Education                              303.0
Wineries                               301.0
Empanadas                              300.0
Beer Garden                            291.0
Dinner Theater                         289.0
International                          288.0
Shaved Snow                            284.0
Nail Salons                            283.0
Arts & Crafts                          280.0
Indonesian                             265.0
Airports                               265.0
Personal Chefs                         263.0
Country Dance Halls                    263.0
Outlet Stores                          262.0
Musicians                              259.0
Public Services & Government           249.0
Hong Kong Style Cafe                   247.0
Real Estate                            245.0
Hostels                                245.0
Themed Cafes                           241.0
Syrian                                 236.0
Donairs                                236.0
Employment Agencies                    232.0
Furniture Stores                       231.0
Swimming Pools                         229.0
Massage                                228.0
Cambodian                              213.0
Department Stores                      213.0
Tours                                  211.0
Distilleries                           198.0
Dominican                              198.0
Mini Golf                              195.0
Zoos                                   195.0
Hair Removal                           191.0
Scandinavian                           183.0
Armenian                               179.0
Florists                               178.0
Shanghainese                           177.0
Festivals                              173.0
Waxing                                 172.0
Drugstores                             171.0
Kombucha                               171.0
Cooking Classes                        170.0
Tabletop Games                         170.0
Eyelash Service                        169.0
Personal Shopping                      168.0
Bangladeshi                            165.0
Nail Technicians                       165.0
Botanical Gardens                      165.0
Gay Bars                               163.0
Appliances                             160.0
Fitness & Instruction                  158.0
Towing                                 157.0
Barbers                                156.0
Wholesale Stores                       154.0
Sri Lankan                             154.0
Home Decor                             152.0
Landmarks & Historical Buildings       151.0
Nutritionists                          148.0
Kitchen & Bath                         146.0
Community Service/Non-Profit           145.0
Basque                                 137.0
Tiki Bars                              137.0
South African                          127.0
Wedding Chapels                        119.0
Books                                  117.0
Music & Video                          117.0
Mags                                   117.0
Antiques                               112.0
Sporting Goods                         110.0
Animal Shelters                        110.0
Skin Care                              110.0
Stadiums & Arenas                      109.0
Pop-Up Restaurants                     106.0
Guamanian                              106.0
Food Tours                             105.0
Tobacco Shops                          100.0
(306,)
In [66]:
train_test_set['categories'] = train_test_set['categories'].str.findall(main_categories)  # keep only the main categories
train_test_set['categories'] = train_test_set['categories'].map(set)  # drop duplicate matches
train_test_set['categories'] = train_test_set['categories'].map(lambda x: ['Other'] if not x else list(x))  # no main category -> 'Other'
train_test_set['categories'] = train_test_set['categories'].map(', '.join)
train_test_set.head()
Out[66]:
[wide DataFrame preview: 5 rows × 93 columns — review/user/business identifiers, star ratings, truth and collaborative scores, per-cuisine average ratings, business attributes, opening hours, and the remapped 'categories' column (e.g. 'Mexican', 'Chinese')]
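As a sanity check, the four mapping steps above can be sketched on a toy Series, with a hypothetical two-category pattern standing in for the real 306-entry `main_categories` regex (here we sort the matches for a deterministic order, which the original `list(x)` does not guarantee):

```python
import pandas as pd

# Hypothetical stand-in for the main_categories pattern built earlier.
main_cats = "Mexican|Chinese"
s = pd.Series(["Mexican, Restaurants", "Food, Nightlife", "Chinese, Mexican, Bars"])
s = s.str.findall(main_cats)                            # keep only the main categories
s = s.map(set)                                          # drop duplicate matches
s = s.map(lambda x: ['Other'] if not x else sorted(x))  # no main category -> 'Other'
s = s.map(', '.join)
print(s.tolist())  # ['Mexican', 'Other', 'Chinese, Mexican']
```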

Now we apply the actual conversion of the categorical features:

In [67]:
train_test_set.shape
Out[67]:
(712379, 93)
In [68]:
cat_cols = ['OutdoorSeating', 'BusinessAcceptsCreditCards', 'RestaurantsDelivery', 'RestaurantsReservations', 'WiFi',
        'Alcohol', 'city']
train_test_set = _pd.get_dummies(train_test_set, columns=cat_cols, prefix=cat_cols)
train_test_set.shape
Out[68]:
(712379, 281)
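A minimal sketch of this one-hot step on toy data: each categorical column becomes one 0/1 indicator column per distinct value, and the string `'None'` coming from the Yelp attributes is treated as just another category (which is why columns like `WiFi_None` appear below):

```python
import pandas as pd

# Toy frame with two of the categorical attribute columns used above.
df = pd.DataFrame({'WiFi': ['Free', 'No', 'None'],
                   'Alcohol': ['Beer&Wine', 'No', 'Beer&Wine']})
dummies = pd.get_dummies(df, columns=['WiFi', 'Alcohol'], prefix=['WiFi', 'Alcohol'])
print(sorted(dummies.columns))
# ['Alcohol_Beer&Wine', 'Alcohol_No', 'WiFi_Free', 'WiFi_No', 'WiFi_None']
```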
In [69]:
# NB: the categories were joined with ', ' above, so splitting on ',' leaves a
# leading space on every non-first item, producing near-duplicate columns such
# as 'categories_ Chinese' alongside 'categories_Chinese'; passing ', ' as the
# separator would avoid this.
categories = train_test_set['categories'].str.get_dummies(',')
categories = categories.rename(columns=lambda x: "categories_" + x)
train_test_set[categories.columns] = categories
train_test_set.drop(columns=['categories'], inplace=True)
train_test_set.shape
Out[69]:
(712379, 787)
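The separator passed to `str.get_dummies` matters here: splitting on `','` after joining with `', '` leaves a leading space on every non-first category, which is what inflates the column count with near-duplicate indicators. A toy illustration:

```python
import pandas as pd

s = pd.Series(['Mexican, Chinese', 'Chinese', 'Other'])
# Splitting on ',' keeps the leading space on ' Chinese', duplicating the column.
print(sorted(s.str.get_dummies(',').columns))   # [' Chinese', 'Chinese', 'Mexican', 'Other']
# Splitting on ', ' matches the join separator and avoids the duplicates.
print(sorted(s.str.get_dummies(', ').columns))  # ['Chinese', 'Mexican', 'Other']
```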
In [70]:
oe = _OrdinalEncoder()
In [71]:
ord_cols = ['Monday_Open', 'Tuesday_Open', 'Wednesday_Open', 'Thursday_Open', 'Friday_Open',
            'Saturday_Open', 'Sunday_Open', 'Monday_Close', 'Tuesday_Close', 'Wednesday_Close',
            'Thursday_Close','Friday_Close', 'Saturday_Close', 'Sunday_Close', 'postal_code']

train_test_set[ord_cols] = oe.fit_transform(train_test_set[ord_cols].to_numpy())
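A small sketch of what `OrdinalEncoder` does to these columns: each distinct value is mapped to an integer index, ordered by the lexicographically sorted categories (the hour strings below are hypothetical, not taken from the dataset):

```python
import numpy as np
from sklearn.preprocessing import OrdinalEncoder

oe = OrdinalEncoder()
hours = np.array([['11:00:00'], ['09:00:00'], ['11:00:00'], ['20:00:00']])
encoded = oe.fit_transform(hours)
print(oe.categories_[0].tolist())   # ['09:00:00', '11:00:00', '20:00:00']
print(encoded.ravel().tolist())     # [1.0, 0.0, 1.0, 2.0]
```

Note that, unlike one-hot encoding, this imposes an (arbitrary) order on the values, which tree-based models tolerate better than linear ones.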

The resulting dataset:

In [72]:
train_test_set.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712379 entries, 0 to 153992
Columns: 787 entries, review_id to categories_Wraps
dtypes: float64(68), int32(1), int64(520), object(3), uint8(195)
memory usage: 3.3+ GB
In [73]:
train_set = train_test_set[:_train_len]
test_set = train_test_set[_train_len:]
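This is a purely positional split, assuming `_train_len` (set earlier in the notebook, when the train and test sets were concatenated for a consistent encoding) marks the boundary between the two parts. On toy sizes:

```python
import pandas as pd

# Toy frame: the first _train_len rows stand in for the training portion.
df = pd.DataFrame({'x': range(10)})
_train_len = 7
train, test = df[:_train_len], df[_train_len:]
print(len(train), len(test))  # 7 3
```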
In [74]:
train_set.head(10)
Out[74]:
[wide DataFrame preview, truncated: 10 rows × 787 columns — numeric review/user/business features, ordinal-encoded postal codes and opening hours, and 0/1 indicator columns for attributes, cities, and categories; note the near-duplicate 'categories_ X' / 'categories_X' pairs produced by splitting on ',' after joining with ', ']
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.000000 2.980769 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.00000 3.686094 2.000000 3.040000 2.600000 3.600000 2.500000 1.000000 3.750000 2.923077 4.000000 3.662669 1.666667 3.146341 2.60000 3.750000 2.500000 1.000000 3.666667 2.954478 4.000000 3.678871 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
5 rPNt-m2pdt-OR_NMQnHjCQ tKIihU81IA3NjpsADuR-Tg --6MefnULPED_I942VcFNA 5 3 0 2 1 0.907921 4.400000 4.400000 4.395327 4.376569 4.405801 4.313458 1 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 4.111111 230 2 12 166 70 32 2326 9.0 4.111111 9.0 4.132161 8.602213 4.400000 3.79608 3.686094 3.777956 4.500000 3.684951 3.789846 1.000000 3.868171 5.000000 4.400000 3.763461 3.662669 3.749776 4.500000 3.66752 3.771654 1.000000 3.851784 5.000000 4.395327 3.789966 3.678871 3.770170 4.481197 3.676024 3.788204 1.000000 3.867280 5.000000 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
6 Kg582pH05mZO_E6WS8PrKA XNOs3Wz1Q_zdRgm1Hy05fg --6MefnULPED_I942VcFNA 1 2 2 1 -1 0.184327 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 3.703313 1 0 0 2 1 2 1462 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
7 gkW6_UqV9b2XI_5ae8rBCg HSHuSCJvIvf_Tof62uZPEw --6MefnULPED_I942VcFNA 2 2 1 0 -1 0.687126 1.800000 1.692308 1.768813 1.816353 1.733574 1.724494 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 2.543478 87 1 5 96 34 65 2158 46.0 2.581395 43.0 2.580489 38.588153 1.800000 3.00000 3.686094 3.166667 2.875000 2.875000 3.000000 3.933014 1.000000 2.000000 1.692308 3.000000 3.662669 3.166667 3.142857 2.87500 3.000000 3.904608 1.000000 2.000000 1.768813 3.166279 3.678871 3.153689 3.005042 2.823432 2.991269 3.928912 1.000000 2.006375 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
8 02voOwsYf0cEdKNzt5IkwA yvpX68yurPsope6KhBZrYA --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 1 0.935335 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 23 0 0 23 8 2 286 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
9 M67I-I5ATaqtVLtKZTgygw gvh8bvei5vwfoIYbNIvNDQ --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 -1 0.242671 3.000000 3.000000 3.040677 3.007972 3.009842 3.062459 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 2.600000 21 0 0 6 0 3 4 5.0 2.666667 3.0 2.680455 3.015285 3.556721 2.00000 3.686094 3.777956 3.000000 2.500000 3.789846 3.933014 3.868171 3.770015 3.542997 2.000000 3.662669 3.749776 3.000000 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 2.000000 3.678871 3.770170 3.040677 2.562281 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [75]:
train_set.shape
Out[75]:
(558386, 787)
In [76]:
train_set.to_pickle('../dataset/m2_n9/model_train_set_3.pickle')
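As a side note, the sets are stored with `to_pickle` rather than CSV because pickling round-trips the DataFrame exactly, dtypes included. A minimal self-contained sketch (with a toy frame and a temporary path, not the notebook's actual data):

```python
import tempfile, os
import pandas as pd

# Toy frame with mixed dtypes, mimicking a slice of the feature table.
df = pd.DataFrame({"stars_review": [4, 5], "average_stars_user": [3.7, 2.9]})

# Round-trip through a pickle file: values and dtypes are preserved exactly.
path = os.path.join(tempfile.mkdtemp(), "example_set.pickle")
df.to_pickle(path)
restored = pd.read_pickle(path)

assert restored.equals(df)
assert (restored.dtypes == df.dtypes).all()
```

Loading the saved set back later is then a single `pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')` call.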
In [77]:
test_set.head(10)
Out[77]:
review_id user_id business_id stars_review useful_review funny_review cool_review bin_truth_score real_truth_score cuisine_av_hist(_bin/_real) coll_score(_bin/_real) likes postal_code latitude longitude review_count stars_restaurant {Monday…Sunday}_Open/_Close average_stars(_bin/_real)_review num_reviews(_bin/_real)_review compliment_count average_stars(_bin/_real)_user num_reviews(_bin/_real)_user review years_of_elite fans useful_user cool_user funny_user friends av_rat_{chinese, japanese, mexican, italian, others, american, korean, mediterranean, thai, asianfusion}_cuisine(_bin/_real) OutdoorSeating_* BusinessAcceptsCreditCards_* RestaurantsDelivery_* RestaurantsReservations_* WiFi_* Alcohol_* city_* categories_*
(full column list and the 10 sample rows of test_set truncated: 787 columns in total, the city_* and categories_* groups being one-hot indicator columns)
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 1 0.990602 4.300000 4.333333 4.299574 4.349620 4.302949 4.288981 1 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 4.108108 227 2 7 99 47 30 286 37.0 4.161290 31.0 4.130733 31.326167 4.300000 3.75000 3.686094 3.500000 4.454545 3.666667 3.800000 3.933014 4.000000 4.000000 4.333333 3.750000 3.662669 3.000000 4.555556 3.50000 4.000000 3.904608 4.000000 4.000000 4.299574 3.724926 3.678871 3.340936 4.538601 3.626374 3.946442 3.928912 4.00000 4.000000 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 1 0.968214 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 3.703313 1 0 0 0 0 0 2110 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 1 0.995667 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 59 0 0 5 0 1 22 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
5 NYG9z-whhsV99RbDR4KPWQ 3l-Rmqcw_Cm1mTxlqEmLEQ --9e1ONYQuAa-CB_Rrw7Tw 5 1 0 5 1 0.123990 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 17 0 0 7 7 1 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
6 14tBa-RMPWBP93y6KcOyQQ 2vjNw6qpyvXAqRhSPzmHtQ --9e1ONYQuAa-CB_Rrw7Tw 2 1 0 0 1 0.530155 3.000000 3.704594 3.000000 2.995295 3.722907 3.009147 0 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 2.500000 105 0 0 31 12 19 4 2.0 2.000000 1.0 2.258306 1.292847 3.556721 3.79608 3.686094 3.777956 3.000000 2.000000 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 2.00000 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.000000 2.000000 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
7 A-n5xtGMR5Frz2KPJTfRzw WiY9q-Jz42huWzq90fgAWA --9e1ONYQuAa-CB_Rrw7Tw 2 0 0 0 -1 0.474595 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 4 0 1 0 0 0 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
8 DqmGICsMu16YttevGUZCjg q9q9nVaTYz7tScwZLHNO3A --9e1ONYQuAa-CB_Rrw7Tw 5 0 1 0 1 0.900049 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 5 0 0 2 0 1 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
9 pSHJpti6SfYIK3XhHzvrXg q_sv4HEU4XM88x9z6WG-Tw --9e1ONYQuAa-CB_Rrw7Tw 2 0 0 0 -1 0.487348 4.000000 4.000000 4.000000 4.001269 4.001652 4.015008 0 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.000000 10 0 0 4 3 0 22 2.0 4.000000 1.0 3.820433 0.773019 3.556721 3.79608 3.686094 2.000000 4.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 4.000000 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 2.000000 4.000000 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [78]:
test_set.shape
Out[78]:
(153993, 787)
In [79]:
test_set.to_pickle('../dataset/m2_n9/model_test_set_3.pickle')
In [80]:
_del_all()

6. Models

6.1 Linear SVM

(see the docs)
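As a hedged sketch of how a linear SVM is typically fit with scikit-learn — the synthetic `X`/`y` below stand in for the notebook's actual feature matrix, and the `C` grid is an illustrative assumption, not the notebook's chosen values:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in data: 200 samples, 10 features, a nearly linearly
# separable binary target (illustrative only, not the Yelp features).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

# Grid-search over the regularization strength C, mirroring the
# GridSearchCV import at the top of the notebook.
grid = GridSearchCV(LinearSVC(max_iter=10000), {"C": [0.01, 0.1, 1.0]}, cv=3)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

The same pattern applies to the real `train_set` once the id columns are dropped and the target column is split off.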

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.head()
Out[3]:
[wide-table output truncated: train_set.head() displays all 787 columns. The feature groups are: review fields (review_id, stars_review, useful/funny/cool counts), truth scores (bin_truth_score, real_truth_score), collaborative scores (coll_score and its _bin/_real variants), restaurant metadata (postal_code, latitude, longitude, review_count, stars_restaurant, per-day opening/closing hours), per-cuisine average ratings (av_rat_*_cuisine with _bin/_real variants), user statistics (fans, friends, compliment_count, average_stars_user, years_of_elite), and one-hot encoded dummies for attributes (OutdoorSeating_*, BusinessAcceptsCreditCards_*, RestaurantsDelivery_*, RestaurantsReservations_*, WiFi_*, Alcohol_*), cities (city_*), and categories (categories_*).]
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 -1 0.595406 3.726715 3.704594 3.719363 -0.062602 -0.064146 -0.054028 1 2434.0 45.569980 -73.199634 4 5.0 45.0 41.0 45.0 44.0 42.0 40.0 45.0 59.0 61.0 60.0 71.0 71.0 70.0 61.0 5.000000 3.0 5.000000 3.0 5.000000 1.746082 0.009869 3.000000 10 0 0 2 0 0 22 3.0 3.000000 3.0 3.022921 2.723338 3.556721 3.79608 3.686094 3.777956 3.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.000000 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.022921 3.676024 3.788204 3.928912 3.867280 3.767263 0 1 0 0 1 0 0 1 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 1 0.993384 3.013889 3.014925 3.039444 -0.507850 -0.525615 -0.455618 1 4057.0 43.579150 -79.683305 51 3.5 15.0 14.0 17.0 16.0 14.0 14.0 16.0 59.0 61.0 60.0 67.0 36.0 35.0 61.0 3.521739 46.0 3.540541 37.0 3.495062 36.481444 0.009869 3.157143 236 5 25 809 331 162 3238 140.0 3.152672 131.0 3.158813 126.185462 3.833333 2.87500 4.000000 3.250000 3.013889 3.125000 3.333333 3.400000 2.666667 2.666667 3.823529 2.600000 4.000000 3.250000 3.014925 3.066667 3.333333 3.400000 2.400000 2.666667 3.821347 2.726907 4.000000 3.303025 3.039444 3.145065 2.847918 3.387395 2.572254 2.752899 0 0 1 0 1 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 1 0.519254 3.726715 3.704594 3.719363 -0.062602 -0.064146 -0.054028 0 493.0 33.608745 -112.359880 190 2.5 29.0 26.0 30.0 29.0 27.0 25.0 27.0 59.0 61.0 60.0 67.0 71.0 70.0 61.0 2.821229 179.0 2.814286 140.0 2.817668 133.926571 0.009869 3.703313 1 0 0 0 0 0 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 0 0 1 0 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 1 0.983368 1.000000 1.000000 1.000000 -3.139748 -3.132280 -3.125614 1 530.0 36.109837 -115.174212 4286 4.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 17.0 14.0 15.0 15.0 21.0 21.0 17.0 4.139748 3814.0 4.132280 3417.0 4.125614 3319.856699 0.009869 1.000000 73 0 1 41 17 32 9382 2.0 1.000000 1.0 1.000000 1.134894 3.556721 3.79608 3.686094 1.000000 1.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 1.000000 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 1.000000 1.000000 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 1 0.866956 2.750000 2.666667 2.748879 -1.212807 -1.267752 -1.191953 0 530.0 36.103061 -115.173450 2844 4.0 29.0 26.0 30.0 25.0 23.0 25.0 29.0 0.0 9.0 9.0 67.0 9.0 9.0 8.0 3.962807 2608.0 3.934419 2272.0 3.940832 2229.696159 0.009869 3.625000 548 5 10 273 87 46 1654 8.0 3.714286 7.0 3.621997 7.577900 3.556721 3.79608 4.000000 4.500000 5.000000 2.750000 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 4.000000 4.500000 5.000000 2.666667 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 4.000000 4.507847 5.000000 2.748879 3.788204 3.928912 3.867280 3.767263 0 0 1 0 0 1 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [4]:
# sub_train_set = train_set[:round(train_set.shape[0]/3)]  # optionally train on the first third of the rows only
sub_train_set = train_set
del train_set
sub_train_set.shape
Out[4]:
(558386, 817)
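The commented-out slice above would take the first third of the rows, which silently assumes the rows are randomly ordered. If a subset is needed to keep the grid search tractable, a stratified sample preserves the class balance. A minimal sketch (synthetic data stands in for the pickled train set, which isn't loaded here):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the training set: 900 rows, imbalanced binary target.
rng = np.random.RandomState(0)
X = rng.rand(900, 4)
y = (rng.rand(900) < 0.25).astype(int)

# Keep a third of the rows while preserving the class balance, instead of
# slicing the first third (which assumes the rows are randomly ordered).
X_sub, _, y_sub, _ = train_test_split(X, y, train_size=1/3,
                                      stratify=y, random_state=0)

print(X_sub.shape)  # (300, 4)
print(abs(y_sub.mean() - y.mean()) < 0.05)  # True: class balance preserved
```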
In [5]:
# define classifier
svc_classifier = _LinearSVC(random_state=0, max_iter=50000)
svc_classifier.get_params()
Out[5]:
{'C': 1.0,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 0}
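LinearSVC (liblinear) is sensitive to feature scaling, and the fits below end with a `ConvergenceWarning`; standardizing the features usually speeds convergence considerably. A hedged sketch of wrapping the classifier in a scaling pipeline (synthetic data; the real inputs are the pickled feature columns):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Toy data with exaggerated scale differences between columns.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X[:, :10] *= 100.0

# Standardizing inside a Pipeline keeps train/test scaling consistent
# and usually lets liblinear converge in far fewer iterations.
clf = make_pipeline(StandardScaler(),
                    LinearSVC(random_state=0, max_iter=50000))
clf.fit(X, y)
print(round(clf.score(X, y), 2))
```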
In [6]:
# fine-tune the classifier: grid search over the regularization strength C
# (a 'gamma' grid would only apply to kernel SVMs, not LinearSVC)
# param_grid = {'C':[0.001,0.01,0.1,0.25,0.5,0.75,1,10,100,1000], 'gamma':[3,2,1,0.1,0.001,0.0001]}
param_grid = {'C':[0.001,0.01,0.1,0.25,0.5,0.75,1,10,100,1000]}
# grid = _GridSearchCV(estimator=svc_classifier, param_grid=param_grid, refit=True, verbose=2, cv=3, error_score=_np.nan, n_jobs=1, pre_dispatch=1)
grid = _GridSearchCV(estimator=svc_classifier, param_grid=param_grid, refit=True, verbose=2, cv=3, error_score=_np.nan, n_jobs=-1, pre_dispatch=6)
grid.fit(sub_train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), sub_train_set['likes'])
print("best params:", grid.best_params_, "- best score:", grid.best_score_)
Fitting 3 folds for each of 10 candidates, totalling 30 fits
[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.
[Parallel(n_jobs=-1)]: Done  30 out of  30 | elapsed: 1654.8min finished
best params: {'C': 0.001} - best score: 0.7436808945783024
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
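Liblinear failed to converge even with `max_iter=50000`. Since the training set has far more rows (~558k) than features (817), the primal formulation (`dual=False`) is generally recommended for `LinearSVC` and often converges well within the iteration budget. A small sketch on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# With n_samples >> n_features, solving the primal problem (dual=False)
# typically converges without bumping into max_iter.
primal = LinearSVC(C=0.001, dual=False, random_state=0).fit(X, y)
print(primal.coef_.shape)  # (1, 20)
```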
In [7]:
print("results:", grid.cv_results_)
results: {'mean_fit_time': array([19900.99567477, 19961.755548  , 19843.87850094, 19883.72761846,
       19875.85472822, 19892.886506  , 19819.18020558, 19831.93277272,
       19672.39210947, 19633.8438464 ]), 'std_fit_time': array([11.92322164, 24.47909948, 11.8002273 ,  2.51810785, 18.78170298,
       16.00907826,  5.86167115, 32.23873155,  4.59971461, 24.49313577]), 'mean_score_time': array([ 2.63927579,  5.25461706,  5.47336515, 10.05810801,  5.17248909,
        3.84837699,  2.75297205,  2.87099091,  2.33608754,  1.75242146]), 'std_score_time': array([0.28703641, 3.56587212, 1.62402285, 4.61803914, 2.93643879,
       1.57086447, 0.04191375, 0.27929337, 0.24098951, 0.2409035 ]), 'param_C': masked_array(data=[0.001, 0.01, 0.1, 0.25, 0.5, 0.75, 1, 10, 100, 1000],
             mask=[False, False, False, False, False, False, False, False,
                   False, False],
       fill_value='?',
            dtype=object), 'params': [{'C': 0.001}, {'C': 0.01}, {'C': 0.1}, {'C': 0.25}, {'C': 0.5}, {'C': 0.75}, {'C': 1}, {'C': 10}, {'C': 100}, {'C': 1000}], 'split0_test_score': array([0.74414656, 0.68984581, 0.71937893, 0.57199269, 0.71992156,
       0.70982109, 0.59223124, 0.63179498, 0.63454575, 0.63445441]), 'split1_test_score': array([0.74462735, 0.7388034 , 0.70227478, 0.59544507, 0.49747486,
       0.62558025, 0.65421645, 0.68013947, 0.66420958, 0.66396781]), 'split2_test_score': array([0.74226876, 0.72382984, 0.55883048, 0.61401831, 0.47335167,
       0.52100705, 0.69141129, 0.52799149, 0.52916273, 0.5987976 ]), 'mean_test_score': array([0.74368089, 0.71749292, 0.66016161, 0.59381861, 0.56358326,
       0.61880312, 0.6459528 , 0.61330871, 0.60930611, 0.63240661]), 'std_test_score': array([0.00101764, 0.020483  , 0.07199118, 0.0171954 , 0.11098627,
       0.0772319 , 0.04090958, 0.06347462, 0.0579493 , 0.02664495]), 'rank_test_score': array([ 1,  2,  3,  9, 10,  6,  4,  7,  8,  5])}
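The raw `cv_results_` dict is hard to scan; loading it into a DataFrame and sorting by rank makes the comparison across `C` values immediate. A sketch with a small synthetic grid (the real search above ran for roughly 27 hours, so it is not re-run here):

```python
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

grid = GridSearchCV(LinearSVC(dual=False, random_state=0),
                    {'C': [0.001, 0.01, 0.1]}, cv=3)
grid.fit(X, y)

# One row per candidate C, best-ranked first.
results = (pd.DataFrame(grid.cv_results_)
           [['param_C', 'mean_test_score', 'std_test_score', 'rank_test_score']]
           .sort_values('rank_test_score'))
print(results.to_string(index=False))
```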
In [8]:
del sub_train_set
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
In [5]:
best_model = grid.best_estimator_
best_model.set_params(verbose=5)
best_model.get_params()
Out[5]:
{'C': 0.001,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 5}
In [9]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Out[9]:
LinearSVC(C=0.001, class_weight=None, dual=True, fit_intercept=True,
          intercept_scaling=1, loss='squared_hinge', max_iter=50000,
          multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
          verbose=0)
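Refitting takes hours, so it is worth persisting the fitted estimator; the notebook already imports joblib for this. A minimal sketch (the file name is an arbitrary choice here, and a toy model stands in for the real one):

```python
import joblib
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LinearSVC(C=0.001, dual=False, random_state=0).fit(X, y)

# Persist the fitted estimator so the expensive fit never has to be repeated.
joblib.dump(model, 'best_model.joblib')
reloaded = joblib.load('best_model.joblib')
print((reloaded.predict(X) == model.predict(X)).all())  # True
```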
In [10]:
print("coef:", best_model.coef_)
print("intercept:", best_model.intercept_)
coef: [[-1.43167078e-01 -2.03906910e-01  3.42779107e-01 -1.04728908e-02
  -1.47427540e-01 -2.43472370e-02 -9.26808435e-03 -1.17709493e-02
   5.58234365e-02  1.83726109e-02 -7.91658762e-03  7.49998933e-05
  -3.29092790e-02  7.08513173e-03 -4.64203658e-05  6.24335229e-01
   ...                                             [output truncated: one coefficient per input feature]
   3.28010985e-02  4.33930745e-03 -1.08360486e-02 -4.29211655e-04
  -1.37964806e-02  0.00000000e+00  2.40683045e-02  2.07455480e-02]]
intercept: [-0.32916385]
In [3]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.head()
Out[3]:
[head of test_set: 5 rows × several hundred columns — review/user/business features (stars_review, useful_review, collaborative and cuisine scores, opening hours, …) plus one-hot city_* and categories_* dummy columns; full column dump omitted]
In [5]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
predictions:
 [1 1 1 ... 1 1 1]
In [8]:
set(predic)
Out[8]:
{0, 1}
In [9]:
# evaluate classifier

print("Report for Support Vector Machine:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Support Vector Machine:", _accuracy_score(test_set['likes'], predic)*100)
Report for Support Vector Machine:
              precision    recall  f1-score   support

           0       0.70      0.36      0.48     50930
           1       0.74      0.92      0.82    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.64      0.65    153993
weighted avg       0.73      0.74      0.71    153993

Accuracy for Support Vector Machine: 73.72737721844499
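The 73.7% accuracy is best read against the class imbalance visible in the support column of the report: always predicting the majority class (likes = 1) would already be right about two thirds of the time. A quick check using the support counts printed above:

```python
# majority-class baseline from the support counts in the report above
# (class 1: 103063 samples, class 0: 50930 samples)
pos, neg = 103063, 50930
baseline = pos / (pos + neg)
print(round(baseline * 100, 2))  # ≈ 66.93, vs. the SVM's 73.73
```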
In [10]:
# Confusion matrix for SVC

print("Confusion Matrix for SVC: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for SVC: 
Out[10]:
array([[18347, 32583],
       [ 7875, 95188]], dtype=int64)
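Row-normalizing the matrix makes the per-class behaviour easier to read; this is just a rearrangement of the counts printed above:

```python
import numpy as np

# confusion matrix for the SVC, copied from the output above
cm = np.array([[18347, 32583],
               [ 7875, 95188]])

# divide each row by its class support: entry (i, j) becomes the fraction
# of true-class-i samples that were predicted as class j
rates = cm / cm.sum(axis=1, keepdims=True)
print(rates.round(3))  # the 0.36 / 0.92 diagonal matches the per-class recall in the report
```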
In [11]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("SVM ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
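Since `predic` contains hard 0/1 labels, the curve above has only a single interior point. LinearSVC also exposes continuous margins via `decision_function`, which yield a proper multi-threshold ROC curve and an AUC value. A minimal sketch on synthetic data (the real call would pass the same feature matrix given to `predict`):

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_curve, roc_auc_score

# synthetic stand-in for the real test features/labels (illustration only)
rng = np.random.RandomState(0)
X = rng.randn(500, 5)
y = (X[:, 0] + 0.5 * rng.randn(500) > 0).astype(int)

clf = LinearSVC(max_iter=5000).fit(X, y)

# continuous margins give one threshold per distinct score, instead of the
# three points roc_curve produces from hard 0/1 predictions
scores = clf.decision_function(X)
fpr, tpr, thresholds = roc_curve(y, scores)
auc = roc_auc_score(y, scores)
print(len(thresholds) > 3, 0.5 < auc <= 1.0)
```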
In [17]:
_jl.dump(best_model, "../models/best_SVM.joblib")
Out[17]:
['../models/best_SVM.joblib']
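The saved file can later be restored with `joblib.load`; a minimal round-trip sketch on toy data (path and model are illustrative, not the notebook's `best_model`):

```python
import os
import tempfile

import joblib
from sklearn.svm import LinearSVC

# fit a tiny model, persist it, and load it back
clf = LinearSVC(max_iter=2000).fit([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 0, 1, 1])
path = os.path.join(tempfile.mkdtemp(), "svm_demo.joblib")  # hypothetical path
joblib.dump(clf, path)
restored = joblib.load(path)
print((restored.coef_ == clf.coef_).all())  # the round-trip preserves the weights
```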
In [ ]:
_del_all()

6.2 Random Forest Classifier

(see the docs)

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.head()
Out[3]:
[head of train_set: same column layout as test_set — review/user/business features plus one-hot city_* and categories_* dummy columns; full column dump omitted]
Wash categories_ Caribbean categories_ Casinos categories_ Caterers categories_ Cheese Shops categories_ Cheesesteaks categories_ Chicken Shop categories_ Chicken Wings categories_ Chinese categories_ Chocolatiers & Shops categories_ Cinema categories_ Cocktail Bars categories_ Coffee & Tea categories_ Coffee Roasteries categories_ Colombian categories_ Comfort Food categories_ Community Service/Non-Profit categories_ Convenience Stores categories_ Conveyor Belt Sushi categories_ Cooking Classes categories_ Country Dance Halls categories_ Creperies categories_ Cuban categories_ Cupcakes categories_ Custom Cakes categories_ Dance Clubs categories_ Day Spas categories_ Delicatessen categories_ Delis categories_ Department Stores categories_ Desserts categories_ Dim Sum categories_ Diners categories_ Dinner Theater categories_ Distilleries categories_ Dive Bars categories_ Do-It-Yourself Food categories_ Dominican categories_ Donairs categories_ Donuts categories_ Drugstores categories_ Eatertainment categories_ Education categories_ Egyptian categories_ Empanadas categories_ Employment Agencies categories_ Ethiopian categories_ Ethnic Food categories_ Ethnic Grocery categories_ Event Planning & Services categories_ Eyelash Service categories_ Falafel categories_ Farmers Market categories_ Fashion categories_ Fast Food categories_ Festivals categories_ Filipino categories_ Fish & Chips categories_ Fitness & Instruction categories_ Florists categories_ Flowers & Gifts categories_ Fondue categories_ Food Court categories_ Food Delivery Services categories_ Food Stands categories_ Food Tours categories_ Food Trucks categories_ French categories_ Fruits & Veggies categories_ Furniture Stores categories_ Gas Stations categories_ Gastropubs categories_ Gay Bars categories_ Gelato categories_ German categories_ Gift Shops categories_ Gluten-Free categories_ Golf categories_ Greek categories_ Grocery categories_ Guamanian categories_ Hair Removal categories_ Hair Salons 
categories_ Halal categories_ Hawaiian categories_ Health & Medical categories_ Health Markets categories_ Himalayan/Nepalese categories_ Home & Garden categories_ Home Decor categories_ Home Services categories_ Hong Kong Style Cafe categories_ Hookah Bars categories_ Hostels categories_ Hot Dogs categories_ Hot Pot categories_ Hotels categories_ Hotels & Travel categories_ Hungarian categories_ Ice Cream & Frozen Yogurt categories_ Imported Food categories_ Indian categories_ Indonesian categories_ International categories_ International Grocery categories_ Internet Cafes categories_ Irish categories_ Italian categories_ Izakaya categories_ Japanese categories_ Jazz & Blues categories_ Juice Bars & Smoothies categories_ Karaoke categories_ Kebab categories_ Kids Activities categories_ Kitchen & Bath categories_ Korean categories_ Kosher categories_ Landmarks & Historical Buildings categories_ Laotian categories_ Latin American categories_ Lebanese categories_ Live/Raw Food categories_ Local Flavor categories_ Local Services categories_ Lounges categories_ Macarons categories_ Mags categories_ Malaysian categories_ Massage categories_ Meat Shops categories_ Mediterranean categories_ Mexican categories_ Middle Eastern categories_ Mini Golf categories_ Modern European categories_ Mongolian categories_ Moroccan categories_ Museums categories_ Music & Video categories_ Music Venues categories_ Musicians categories_ Nail Salons categories_ Nail Technicians categories_ New Mexican Cuisine categories_ Nightlife categories_ Noodles categories_ Nutritionists categories_ Organic Stores categories_ Outlet Stores categories_ Pakistani categories_ Pan Asian categories_ Party & Event Planning categories_ Pasta Shops categories_ Patisserie/Cake Shop categories_ Performing Arts categories_ Persian/Iranian categories_ Personal Chefs categories_ Personal Shopping categories_ Peruvian categories_ Pets categories_ Piano Bars categories_ Pizza categories_ Playgrounds categories_ Poke 
categories_ Polish categories_ Pool Halls categories_ Pop-Up Restaurants categories_ Portuguese categories_ Poutineries categories_ Pretzels categories_ Professional Services categories_ Public Markets categories_ Public Services & Government categories_ Pubs categories_ Puerto Rican categories_ Ramen categories_ Real Estate categories_ Resorts categories_ Russian categories_ Salad categories_ Salvadoran categories_ Sandwiches categories_ Scandinavian categories_ Seafood categories_ Shanghainese categories_ Shaved Ice categories_ Shaved Snow categories_ Shopping categories_ Singaporean categories_ Skin Care categories_ Smokehouse categories_ Social Clubs categories_ Soul Food categories_ Soup categories_ South African categories_ Southern categories_ Spanish categories_ Speakeasies categories_ Specialty Food categories_ Sporting Goods categories_ Sports Bars categories_ Sri Lankan categories_ Stadiums & Arenas categories_ Steakhouses categories_ Street Vendors categories_ Sushi Bars categories_ Swimming Pools categories_ Syrian categories_ Szechuan categories_ Tabletop Games categories_ Tacos categories_ Taiwanese categories_ Tapas Bars categories_ Tapas/Small Plates categories_ Tea Rooms categories_ Teppanyaki categories_ Tex-Mex categories_ Thai categories_ Themed Cafes categories_ Tiki Bars categories_ Tobacco Shops categories_ Tours categories_ Towing categories_ Turkish categories_ Tuscan categories_ Ukrainian categories_ Vegan categories_ Vegetarian categories_ Venezuelan categories_ Venues & Event Spaces categories_ Vietnamese categories_ Waffles categories_ Waxing categories_ Wedding Planning categories_ Whiskey Bars categories_ Wholesale Stores categories_ Wine & Spirits categories_ Wine Bars categories_ Wineries categories_ Wraps categories_ Zoos categories_Acai Bowls categories_Active Life categories_Adult Entertainment categories_Afghan categories_African categories_Airports categories_American (New) categories_American (Traditional) 
categories_Amusement Parks categories_Arabian categories_Arcades categories_Argentine categories_Armenian categories_Art Galleries categories_Arts & Crafts categories_Arts & Entertainment categories_Asian Fusion categories_Automotive categories_Bagels categories_Bakeries categories_Bangladeshi categories_Barbeque categories_Bars categories_Beauty & Spas categories_Belgian categories_Bistros categories_Books categories_Bowling categories_Brasseries categories_Brazilian categories_Breakfast & Brunch categories_Breweries categories_British categories_Bubble Tea categories_Buffets categories_Burgers categories_Butcher categories_Cafes categories_Cafeteria categories_Cajun/Creole categories_Cambodian categories_Canadian (New) categories_Candy Stores categories_Cantonese categories_Car Wash categories_Caribbean categories_Casinos categories_Caterers categories_Cheese Shops categories_Cheesesteaks categories_Chicken Shop categories_Chicken Wings categories_Chinese categories_Cinema categories_Cocktail Bars categories_Coffee & Tea categories_Coffee Roasteries categories_Comfort Food categories_Community Service/Non-Profit categories_Convenience Stores categories_Conveyor Belt Sushi categories_Cooking Classes categories_Creperies categories_Cuban categories_Cupcakes categories_Custom Cakes categories_Dance Clubs categories_Day Spas categories_Delicatessen categories_Delis categories_Department Stores categories_Desserts categories_Dim Sum categories_Diners categories_Dinner Theater categories_Do-It-Yourself Food categories_Dominican categories_Donairs categories_Donuts categories_Drugstores categories_Education categories_Egyptian categories_Ethiopian categories_Ethnic Food categories_Ethnic Grocery categories_Event Planning & Services categories_Farmers Market categories_Fashion categories_Fast Food categories_Festivals categories_Filipino categories_Fish & Chips categories_Fitness & Instruction categories_Florists categories_Flowers & Gifts categories_Fondue 
categories_Food Court categories_Food Delivery Services categories_Food Stands categories_Food Trucks categories_French categories_Fruits & Veggies categories_Furniture Stores categories_Gas Stations categories_Gastropubs categories_German categories_Gift Shops categories_Gluten-Free categories_Golf categories_Greek categories_Grocery categories_Hair Removal categories_Hair Salons categories_Hakka categories_Halal categories_Hawaiian categories_Health & Medical categories_Health Markets categories_Himalayan/Nepalese categories_Home & Garden categories_Home Services categories_Hong Kong Style Cafe categories_Hot Dogs categories_Hot Pot categories_Hotels categories_Hotels & Travel categories_Hungarian categories_Ice Cream & Frozen Yogurt categories_Imported Food categories_Indian categories_Indonesian categories_International categories_International Grocery categories_Internet Cafes categories_Irish categories_Italian categories_Izakaya categories_Japanese categories_Jazz & Blues categories_Juice Bars & Smoothies categories_Karaoke categories_Kebab categories_Kids Activities categories_Kitchen & Bath categories_Kombucha categories_Korean categories_Kosher categories_Landmarks & Historical Buildings categories_Latin American categories_Live/Raw Food categories_Local Flavor categories_Lounges categories_Macarons categories_Malaysian categories_Meat Shops categories_Mediterranean categories_Mexican categories_Middle Eastern categories_Modern European categories_Mongolian categories_Moroccan categories_Music Venues categories_New Mexican Cuisine categories_Nightlife categories_Noodles categories_Nutritionists categories_Other categories_Outlet Stores categories_Pakistani categories_Pan Asian categories_Party & Event Planning categories_Pasta Shops categories_Patisserie/Cake Shop categories_Performing Arts categories_Persian/Iranian categories_Personal Chefs categories_Personal Shopping categories_Peruvian categories_Pets categories_Pizza categories_Poke 
categories_Polish categories_Pool Halls categories_Portuguese categories_Poutineries categories_Pretzels categories_Professional Services categories_Public Markets categories_Public Services & Government categories_Pubs categories_Puerto Rican categories_Real Estate categories_Resorts categories_Russian categories_Salad categories_Salvadoran categories_Sandwiches categories_Scandinavian categories_Seafood categories_Shanghainese categories_Shaved Ice categories_Shopping categories_Singaporean categories_Smokehouse categories_Social Clubs categories_Soul Food categories_Soup categories_South African categories_Southern categories_Spanish categories_Specialty Food categories_Sri Lankan categories_Steakhouses categories_Street Vendors categories_Sushi Bars categories_Szechuan categories_Tabletop Games categories_Taiwanese categories_Tapas Bars categories_Tapas/Small Plates categories_Tea Rooms categories_Teppanyaki categories_Tex-Mex categories_Thai categories_Tiki Bars categories_Tobacco Shops categories_Tours categories_Turkish categories_Ukrainian categories_Vegan categories_Vegetarian categories_Venezuelan categories_Venues & Event Spaces categories_Vietnamese categories_Waffles categories_Wedding Chapels categories_Wholesale Stores categories_Wine & Spirits categories_Wineries categories_Wraps categories_Zoos
0 ---HLAnHbuLi7vd5TL6uYg zyp8SaRnZ94sWZpLrifS1Q l6xZVTEtdZAvNpL1JhYGuw 4 0 0 0 -1 0.595406 3.726715 3.704594 3.719363 -0.062602 -0.064146 -0.054028 1 2434.0 45.569980 -73.199634 4 5.0 45.0 41.0 45.0 44.0 42.0 40.0 45.0 59.0 61.0 60.0 71.0 71.0 70.0 61.0 5.000000 3.0 5.000000 3.0 5.000000 1.746082 0.009869 3.000000 10 0 0 2 0 0 22 3.0 3.000000 3.0 3.022921 2.723338 3.556721 3.79608 3.686094 3.777956 3.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.000000 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.022921 3.676024 3.788204 3.928912 3.867280 3.767263 0 1 0 0 1 0 0 1 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 ---L4b6VR6HoB-q7cfMWIA 697iJkhX1mkVF9RNhn114Q XiXu6WHbDoopKpeg7DfKdQ 5 3 1 2 1 0.993384 3.013889 3.014925 3.039444 -0.507850 -0.525615 -0.455618 1 4057.0 43.579150 -79.683305 51 3.5 15.0 14.0 17.0 16.0 14.0 14.0 16.0 59.0 61.0 60.0 67.0 36.0 35.0 61.0 3.521739 46.0 3.540541 37.0 3.495062 36.481444 0.009869 3.157143 236 5 25 809 331 162 3238 140.0 3.152672 131.0 3.158813 126.185462 3.833333 2.87500 4.000000 3.250000 3.013889 3.125000 3.333333 3.400000 2.666667 2.666667 3.823529 2.600000 4.000000 3.250000 3.014925 3.066667 3.333333 3.400000 2.400000 2.666667 3.821347 2.726907 4.000000 3.303025 3.039444 3.145065 2.847918 3.387395 2.572254 2.752899 0 0 1 0 1 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2 ---sPYSgArT4Sd5v1nDVMQ iVSuN8PrtKVtLzhNiu23uA OumGHdbdp7WgyYMhcAdjhw 1 0 0 0 1 0.519254 3.726715 3.704594 3.719363 -0.062602 -0.064146 -0.054028 0 493.0 33.608745 -112.359880 190 2.5 29.0 26.0 30.0 29.0 27.0 25.0 27.0 59.0 61.0 60.0 67.0 71.0 70.0 61.0 2.821229 179.0 2.814286 140.0 2.817668 133.926571 0.009869 3.703313 1 0 0 0 0 0 4 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.867280 3.767263 0 0 1 0 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 --0SzSMXVUoAXfackNoB4g v9P7J6hWWtIblnylQ5UBfA iCQpiavjjPzJ5_3gPD5Ebg 5 0 0 0 1 0.983368 1.000000 1.000000 1.000000 -3.139748 -3.132280 -3.125614 1 530.0 36.109837 -115.174212 4286 4.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 17.0 14.0 15.0 15.0 21.0 21.0 17.0 4.139748 3814.0 4.132280 3417.0 4.125614 3319.856699 0.009869 1.000000 73 0 1 41 17 32 9382 2.0 1.000000 1.0 1.000000 1.134894 3.556721 3.79608 3.686094 1.000000 1.000000 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 1.000000 3.667520 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 1.000000 1.000000 3.676024 3.788204 3.928912 3.867280 3.767263 1 0 0 0 0 1 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 --1JMhPk6K9fZo4FOp_yMw 2xZ1mHP14as5RJ1KOrVU4A QJatAcxYgK1Zp9BRZMAx7g 2 0 0 0 1 0.866956 2.750000 2.666667 2.748879 -1.212807 -1.267752 -1.191953 0 530.0 36.103061 -115.173450 2844 4.0 29.0 26.0 30.0 25.0 23.0 25.0 29.0 0.0 9.0 9.0 67.0 9.0 9.0 8.0 3.962807 2608.0 3.934419 2272.0 3.940832 2229.696159 0.009869 3.625000 548 5 10 273 87 46 1654 8.0 3.714286 7.0 3.621997 7.577900 3.556721 3.79608 4.000000 4.500000 5.000000 2.750000 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 4.000000 4.500000 5.000000 2.666667 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 4.000000 4.507847 5.000000 2.748879 3.788204 3.928912 3.867280 3.767263 0 0 1 0 0 1 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [4]:
# Keep only the first half of the training set so the grid search stays tractable,
# then free the full set to save memory.
sub_train_set = train_set[:round(train_set.shape[0]/2)]
#sub_train_set = train_set
del train_set
sub_train_set.shape
Out[4]:
(279193, 817)
In [5]:
random_forest = _RandomForestClassifier(n_jobs = -1, random_state = 0)
random_forest.get_params()
Out[5]:
{'bootstrap': True,
 'class_weight': None,
 'criterion': 'gini',
 'max_depth': None,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 1,
 'min_samples_split': 2,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 'warn',
 'n_jobs': -1,
 'oob_score': False,
 'random_state': 0,
 'verbose': 0,
 'warm_start': False}
In [6]:
# Fine-tune the random forest via exhaustive grid search with 3-fold cross-validation

param_grid = {
    'bootstrap': [True, False],
    'max_depth': [10, 30, 50],
    'min_samples_leaf': [1, 2, 4],
    'min_samples_split': [2, 5, 10],
    'n_estimators': [200, 500, 1000],
    'criterion': ['gini', 'entropy']}
 
grid = _GridSearchCV(estimator=random_forest, param_grid=param_grid, refit=False, verbose=5, cv=3, error_score=_np.nan, n_jobs=-1, pre_dispatch=6)
grid.fit(sub_train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), sub_train_set['likes'])
Fitting 3 folds for each of 324 candidates, totalling 972 fits
[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.
[Parallel(n_jobs=-1)]: Done  12 tasks      | elapsed: 25.0min
[Parallel(n_jobs=-1)]: Done  66 tasks      | elapsed: 134.9min
[Parallel(n_jobs=-1)]: Done 156 tasks      | elapsed: 558.6min
[Parallel(n_jobs=-1)]: Done 282 tasks      | elapsed: 1123.2min
[Parallel(n_jobs=-1)]: Done 444 tasks      | elapsed: 2490.3min
[Parallel(n_jobs=-1)]: Done 642 tasks      | elapsed: 3490.7min
[Parallel(n_jobs=-1)]: Done 876 tasks      | elapsed: 4913.4min
[Parallel(n_jobs=-1)]: Done 972 out of 972 | elapsed: 5701.6min finished
Out[6]:
GridSearchCV(cv=3, error_score=nan,
             estimator=RandomForestClassifier(bootstrap=True, class_weight=None,
                                              criterion='gini', max_depth=None,
                                              max_features='auto',
                                              max_leaf_nodes=None,
                                              min_impurity_decrease=0.0,
                                              min_impurity_split=None,
                                              min_samples_leaf=1,
                                              min_samples_split=2,
                                              min_weight_fraction_leaf=0.0,
                                              n_estimators='warn', n_jobs=-1,
                                              oob_score=False, random_state=0,
                                              verbose=0, warm_start=False),
             iid='warn', n_jobs=-1,
             param_grid={'bootstrap': [True, False],
                         'criterion': ['gini', 'entropy'],
                         'max_depth': [10, 30, 50],
                         'min_samples_leaf': [1, 2, 4],
                         'min_samples_split': [2, 5, 10],
                         'n_estimators': [200, 500, 1000]},
             pre_dispatch=6, refit=False, return_train_score=False,
             scoring=None, verbose=5)
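The "324 candidates, totalling 972 fits" reported in the log above follows directly from the size of the parameter grid times the number of folds; a quick check:

```python
import itertools

# Same grid as passed to GridSearchCV above.
param_grid = {
    'bootstrap': [True, False],
    'max_depth': [10, 30, 50],
    'min_samples_leaf': [1, 2, 4],
    'min_samples_split': [2, 5, 10],
    'n_estimators': [200, 500, 1000],
    'criterion': ['gini', 'entropy']}

# Every combination of values is one candidate; each is fit once per CV fold.
n_candidates = len(list(itertools.product(*param_grid.values())))
n_fits = n_candidates * 3  # cv=3
print(n_candidates, n_fits)  # 324 972
```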
In [7]:
print("best params:\n", grid.best_params_)
print("best score:", grid.best_score_)
best params:
 {'bootstrap': False, 'criterion': 'entropy', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}
best score: 0.7453481999906875
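Because the search was run with refit=False, GridSearchCV does not keep a fitted final model; the winning configuration has to be retrained explicitly. A minimal sketch — here make_classification provides stand-in data in place of the notebook's sub_train_set features and likes target, and the params dict is the one found above:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in data for illustration; in the notebook this would be
# sub_train_set.drop(columns=[...]) and sub_train_set['likes'].
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# grid.best_params_ as reported above.
best_params = {'bootstrap': False, 'criterion': 'entropy', 'max_depth': 50,
               'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}

best_rf = RandomForestClassifier(n_jobs=-1, random_state=0, **best_params)
best_rf.fit(X, y)
```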
In [8]:
with open("../models/random_forest_params.txt", 'w') as f:
    f.write(str(grid.best_params_))
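The parameters are written out as the repr of a Python dict, so they can later be parsed back with ast.literal_eval instead of re-running the multi-day search. A self-contained round-trip sketch — a temporary path stands in for the notebook's ../models/random_forest_params.txt:

```python
import ast
import os
import tempfile

best_params = {'bootstrap': False, 'criterion': 'entropy', 'max_depth': 50,
               'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}

# Write the dict's repr to a text file, as in the cell above.
path = os.path.join(tempfile.mkdtemp(), "random_forest_params.txt")
with open(path, 'w') as f:
    f.write(str(best_params))

# literal_eval safely parses the repr back into a dict (unlike eval).
with open(path) as f:
    loaded = ast.literal_eval(f.read())
print(loaded == best_params)  # True
```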
In [9]:
print("results:", grid.cv_results_)
results: {'mean_fit_time': array([ 252.87800407,  600.7321078 , 1147.74082383,  311.25228389,
        597.79628388, 1124.06511203,  307.2705942 ,  662.01461442,
       1102.80993398,  303.65359688,  676.63054156, 1121.13627799,
        290.6686426 ,  672.74625977, 1103.33752354,  299.24372005,
        666.57276177, 1089.51547178,  310.32110922,  677.31205535,
       1087.89945881,  302.18119884,  681.54606994, 1077.58835777,
        343.88937259,  647.70520091, 1116.92719674,  748.18958219,
       1670.8926479 , 3277.3076601 ,  726.21665557, 1612.23545631,
       3128.41236226,  706.48307268, 1692.42808255, 3044.78944604,
        710.94016171, 1687.92898154, 3029.70489526,  716.6063451 ,
       1661.93725832, 2975.88908982,  699.15533026, 1650.08494091,
       2936.43123945,  663.98700976, 1589.69072064, 2841.49037019,
        677.26351619, 1600.60654235, 2818.42968504,  689.23385302,
       1575.95244972, 2845.27126567,  854.03630026, 1924.91824579,
       3589.90236084,  806.29059982, 1901.73787904, 3542.07222215,
        755.210814  , 1842.24359202, 3356.92748769,  772.36362791,
       1820.98641507, 3347.10009464,  775.59133355, 1795.96330682,
       3266.85892995,  748.71085684, 1781.311143  , 3273.03043238,
        728.07022198, 1694.15878701, 3058.85896428,  699.67593479,
       1671.264992  , 2974.15439741,  733.93203028, 1685.07307522,
       2878.58820073,  316.39686712,  704.58548148, 1315.41525666,
        266.52052689,  617.80446243, 1116.14761202,  323.40379898,
        596.31457845, 1192.99234486,  306.07180023,  631.73588936,
       1107.42891963,  297.97278309,  667.9530731 , 1098.42698288,
        303.02261813,  669.46902156, 1087.44833231,  303.09010363,
        673.5451235 , 1114.85340317,  306.89094249,  687.86916415,
       1097.86382119,  297.0615526 ,  674.32869554, 1090.86054126,
        759.74303031, 1654.02773229, 3203.89591153,  739.06663569,
       1612.05327415, 3126.06397398,  717.29151392, 1664.62706677,
       2990.3717099 ,  713.38628578, 1674.57015268, 3052.98233859,
        706.96910882, 1662.88572113, 2994.57327501,  696.03201071,
       1641.71012592, 2944.10638984,  676.90547228, 1583.02720237,
       2877.37689845,  686.42369564, 1584.088492  , 2868.15209929,
        685.93400542, 1559.20022694, 2867.99252415,  829.33399868,
       1916.37242405, 2354.95233067,  793.30663912, 1802.85570685,
       2807.89743249,  762.03286791, 1283.57182018, 3463.58312575,
        731.26084558, 1854.44696649, 3406.09970792,  775.27916543,
        … (verbose `cv_results_` output truncated: per-candidate fit/score timing arrays and the searched RandomForestClassifier parameter grid — bootstrap ∈ {True, False}, criterion ∈ {'gini', 'entropy'}, max_depth ∈ {10, 30, 50}, min_samples_leaf ∈ {1, 2, 4} — omitted for brevity) …
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4,
                   1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2,
                   4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1,
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4,
                   1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2,
                   4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1,
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4,
                   1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2,
                   4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1,
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4,
                   1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2,
                   4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1,
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4,
                   1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2,
                   4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1,
                   2, 2, 2, 2, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 4, 4, 4, 4],
             mask=[False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False],
       fill_value='?',
            dtype=object), 'param_min_samples_split': masked_array(data=[2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10,
                   10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10,
                   10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5,
                   10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5,
                   5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2,
                   2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10,
                   2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10,
                   10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10,
                   10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5,
                   10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5,
                   5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2,
                   2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10,
                   2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10,
                   10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10,
                   10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5,
                   10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5,
                   5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2,
                   2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10, 10,
                   2, 2, 2, 5, 5, 5, 10, 10, 10, 2, 2, 2, 5, 5, 5, 10, 10,
                   10, 2, 2, 2, 5, 5, 5, 10, 10, 10],
             mask=[False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False],
       fill_value='?',
            dtype=object), 'param_n_estimators': masked_array(data=[200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000, 200, 500, 1000, 200, 500, 1000,
                   200, 500, 1000, 200, 500, 1000, 200, 500, 1000, 200,
                   500, 1000, 200, 500, 1000, 200, 500, 1000, 200, 500,
                   1000, 200, 500, 1000],
             mask=[False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False, False, False, False, False,
                   False, False, False, False],
       fill_value='?',
            dtype=object), 'params': [{'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 
'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 10, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 
200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 
4, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 
'gini', 'max_depth': 50, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 2, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 5, 
'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 50, 'min_samples_leaf': 4, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 1, 'min_samples_split': 10, 'n_estimators': 1000}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 200}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 500}, {'bootstrap': True, 'criterion': 'entropy', 'max_depth': 10, 'min_samples_leaf': 2, 'min_samples_split': 2, 'n_estimators': 1000}, {'bootstrap': True, 
[... GridSearchCV output truncated: per-fold test scores for the full parameter grid (bootstrap ∈ {True, False}, criterion ∈ {gini, entropy}, max_depth ∈ {10, 30, 50}, min_samples_leaf ∈ {1, 2, 4}, min_samples_split ∈ {2, 5, 10}, n_estimators ∈ {200, 500, 1000}), all in the range ≈ 0.72–0.75 ...]
       0.72869208, 0.728316  , 0.72874581, 0.72789693, 0.72797215,
       0.74358506, 0.74496046, 0.74512164, 0.74533654, 0.74557294,
       0.74537952, 0.74579859, 0.74579859, 0.74591679, 0.74457363,
       0.74549772, 0.74591679, 0.74462735, 0.7453258 , 0.74561592,
       0.74570188, 0.74629287, 0.74629287, 0.74533654, 0.74568039,
       0.74575561, 0.74533654, 0.74568039, 0.74575561, 0.74468108,
       0.74545474, 0.74543325, 0.73828763, 0.73912576, 0.73907204,
       0.74277916, 0.74275767, 0.74270395, 0.7442835 , 0.74377848,
       0.74460586, 0.74443394, 0.74519685, 0.74522909, 0.74436947,
       0.74551921, 0.7453258 , 0.74586306, 0.74607797, 0.74643256,
       0.7449712 , 0.74572337, 0.74557294, 0.7449712 , 0.74572337,
       0.74557294, 0.745444  , 0.74559443, 0.74573412]), 'split2_test_score': array([0.72668271, 0.72715551, 0.72720923, 0.72636035, 0.72686538,
       0.72681166, 0.72619917, 0.72626365, 0.72685464, 0.72698358,
       0.72719849, 0.72766053, 0.72744563, 0.7273919 , 0.72738116,
       0.72659675, 0.72719849, 0.72748861, 0.7258016 , 0.72651079,
       0.72679017, 0.7258016 , 0.72651079, 0.72679017, 0.72710178,
       0.72752085, 0.72766053, 0.74118886, 0.74203774, 0.74203774,
       0.74144675, 0.74197327, 0.74240308, 0.74196252, 0.74196252,
       0.74183358, 0.74169389, 0.7417906 , 0.74175836, 0.74080203,
       0.74184432, 0.74175836, 0.74109215, 0.74229562, 0.74242457,
       0.74152196, 0.74102768, 0.7413178 , 0.74152196, 0.74102768,
       0.7413178 , 0.74059787, 0.74069458, 0.74093097, 0.73951259,
       0.73973824, 0.74002837, 0.74162942, 0.74274693, 0.74276842,
       0.74300481, 0.74363879, 0.74331643, 0.74199476, 0.74290811,
       0.74309078, 0.74197327, 0.74312301, 0.74294034, 0.74218817,
       0.74247829, 0.74263947, 0.74014657, 0.74114588, 0.74135004,
       0.74014657, 0.74114588, 0.74135004, 0.74057638, 0.74104917,
       0.74135004, 0.72622067, 0.72706954, 0.72713402, 0.72711252,
       0.72622067, 0.72644632, 0.72565116, 0.72559744, 0.72604874,
       0.7261347 , 0.72628514, 0.72614545, 0.72550073, 0.72574787,
       0.72583383, 0.72620992, 0.72688687, 0.72658601, 0.72617768,
       0.72662899, 0.72651079, 0.72617768, 0.72662899, 0.72651079,
       0.72653228, 0.72639259, 0.72662899, 0.74183358, 0.74253202,
       0.74271469, 0.74209146, 0.7430263 , 0.74255351, 0.74217743,
       0.74261798, 0.74246755, 0.74214519, 0.74239233, 0.74254277,
       0.74223115, 0.74255351, 0.74259649, 0.74147898, 0.74197327,
       0.74220966, 0.74125333, 0.74177985, 0.74148973, 0.74125333,
       0.74177985, 0.74148973, 0.74108141, 0.74082352, 0.74084501,
       0.74010358, 0.74084501, 0.74129631, 0.74223115, 0.74325196,
       0.74332717, 0.74364953, 0.74374624, 0.74356357, 0.7430263 ,
       0.74350984, 0.74398264, 0.7432627 , 0.74298332, 0.74356357,
       0.7430263 , 0.74364953, 0.74387518, 0.74174761, 0.74191954,
       0.74208072, 0.74174761, 0.74191954, 0.74208072, 0.74230637,
       0.74227413, 0.74220966, 0.72655377, 0.7272737 , 0.72686538,
       0.72675793, 0.72706954, 0.72701582, 0.72708029, 0.7276283 ,
       0.72713402, 0.7267042 , 0.72675793, 0.72689762, 0.72579085,
       0.72654302, 0.72655377, 0.72574787, 0.72754234, 0.72738116,
       0.72723072, 0.7270588 , 0.72704805, 0.72723072, 0.7270588 ,
       0.72704805, 0.72770352, 0.72715551, 0.72766053, 0.74248904,
       0.74267171, 0.74262873, 0.74297258, 0.7433809 , 0.74327345,
       0.74242457, 0.74346686, 0.74345612, 0.74260724, 0.74330568,
       0.74334866, 0.74299407, 0.74374624, 0.74347761, 0.74374624,
       0.74332717, 0.74324121, 0.74214519, 0.74261798, 0.74228488,
       0.74214519, 0.74261798, 0.74228488, 0.74219892, 0.742575  ,
       0.7428114 , 0.73552609, 0.73620304, 0.73611708, 0.74008209,
       0.74052265, 0.74054414, 0.74209146, 0.74316599, 0.74329494,
       0.74259649, 0.74312301, 0.74319823, 0.74255351, 0.74299407,
       0.74300481, 0.74401487, 0.743714  , 0.74401487, 0.74195178,
       0.7429296 , 0.743714  , 0.74195178, 0.7429296 , 0.743714  ,
       0.74268246, 0.74343463, 0.74327345, 0.72652153, 0.72679017,
       0.72702656, 0.72617768, 0.72666122, 0.72674718, 0.72657526,
       0.72641408, 0.72638184, 0.72585533, 0.72581234, 0.72607023,
       0.72620992, 0.72683315, 0.72683315, 0.72586607, 0.72663973,
       0.72672569, 0.72456589, 0.72653228, 0.72657526, 0.72456589,
       0.72653228, 0.72657526, 0.72560818, 0.72596278, 0.72630663,
       0.74201625, 0.74261798, 0.74280065, 0.74347761, 0.74378922,
       0.74391816, 0.74321972, 0.74374624, 0.74353133, 0.74392891,
       0.74392891, 0.74364953, 0.74306929, 0.74397189, 0.74355282,
       0.74263947, 0.74343463, 0.74368177, 0.74294034, 0.74270395,
       0.74315525, 0.74294034, 0.74270395, 0.74315525, 0.74215594,
       0.74266096, 0.74282214, 0.73719161, 0.73719161, 0.73727757,
       0.74097395, 0.74206997, 0.74204848, 0.74327345, 0.74345612,
       0.74373549, 0.74272544, 0.74309078, 0.74320897, 0.74283289,
       0.74341314, 0.74367102, 0.74385369, 0.74425127, 0.74435872,
       0.74270395, 0.74377848, 0.74404711, 0.74270395, 0.74377848,
       0.74404711, 0.74345612, 0.74420829, 0.74411158]), 'mean_test_score': array([0.72686994, 0.72742511, 0.72751108, 0.72706694, 0.72728901,
       0.72746451, 0.72721737, 0.72752899, 0.72764718, 0.72777971,
       0.72770449, 0.72787642, 0.72771524, 0.7276436 , 0.72769375,
       0.72683054, 0.72777613, 0.72754331, 0.72696665, 0.72727826,
       0.72745735, 0.72696665, 0.72727826, 0.72745735, 0.72753973,
       0.72769375, 0.72770807, 0.74268338, 0.74344629, 0.74343913,
       0.74299857, 0.74377939, 0.74406593, 0.74302722, 0.74379372,
       0.74380088, 0.74308095, 0.7434248 , 0.74340331, 0.74272278,
       0.743346  , 0.74324571, 0.74245056, 0.743346  , 0.74345704,
       0.74197061, 0.74235386, 0.7424828 , 0.74197061, 0.74235386,
       0.7424828 , 0.74233237, 0.74240042, 0.74251862, 0.73992901,
       0.74082803, 0.74103219, 0.74265114, 0.7433066 , 0.74349643,
       0.74370418, 0.74442411, 0.74415906, 0.74328153, 0.74409101,
       0.74400146, 0.74270487, 0.74368627, 0.74399788, 0.74365761,
       0.74413757, 0.74406235, 0.74209597, 0.74274427, 0.74286963,
       0.74209597, 0.74274427, 0.74286963, 0.74186674, 0.74287321,
       0.74311319, 0.72692725, 0.72697381, 0.72702396, 0.72697023,
       0.72695949, 0.72721021, 0.72697381, 0.7270347 , 0.72721737,
       0.72671235, 0.726938  , 0.72700605, 0.72647237, 0.72652609,
       0.72661922, 0.72707768, 0.72721021, 0.72694874, 0.72700605,
       0.7270347 , 0.72695591, 0.72700605, 0.7270347 , 0.72695591,
       0.72706694, 0.72680189, 0.72709559, 0.7429341 , 0.74357523,
       0.74401221, 0.74332093, 0.74404444, 0.74388326, 0.74365403,
       0.74419846, 0.74420562, 0.74315617, 0.74370418, 0.74393699,
       0.7434642 , 0.74401937, 0.74401579, 0.74262965, 0.74346062,
       0.74351792, 0.74255802, 0.74299141, 0.74286963, 0.74255802,
       0.74299141, 0.74286963, 0.74233953, 0.742601  , 0.74237176,
       0.74076714, 0.74174854, 0.74178436, 0.74294126, 0.74371492,
       0.74402295, 0.74356091, 0.74452798, 0.74442411, 0.74373283,
       0.74424502, 0.74445634, 0.74374358, 0.74412682, 0.74457812,
       0.74408742, 0.74459603, 0.74465692, 0.74315975, 0.74333167,
       0.7433854 , 0.74315975, 0.74333167, 0.7433854 , 0.74306662,
       0.74322064, 0.74347136, 0.72608196, 0.72694874, 0.72697381,
       0.72695949, 0.72775464, 0.72788   , 0.72745377, 0.72786209,
       0.72783702, 0.72779045, 0.72776538, 0.72745377, 0.72749675,
       0.72778687, 0.72773315, 0.72745019, 0.72777613, 0.72763644,
       0.72707052, 0.72733199, 0.72745377, 0.72707052, 0.72733199,
       0.72745377, 0.72762927, 0.72757555, 0.72785851, 0.74284097,
       0.74305946, 0.74323138, 0.74392266, 0.74423786, 0.74454231,
       0.74414831, 0.74481452, 0.74492555, 0.74414115, 0.74438829,
       0.74453514, 0.74413757, 0.74471065, 0.74451723, 0.74451007,
       0.74475721, 0.7445638 , 0.74373999, 0.74397639, 0.74410175,
       0.74373999, 0.74397639, 0.74410175, 0.74358956, 0.7439155 ,
       0.74385103, 0.73608221, 0.73677349, 0.73691676, 0.74067043,
       0.74112531, 0.74136171, 0.74301648, 0.74394774, 0.74405161,
       0.74328153, 0.74392983, 0.7439943 , 0.74352151, 0.74382954,
       0.74425218, 0.74445634, 0.74473572, 0.74487183, 0.74348211,
       0.7441125 , 0.74463185, 0.74348211, 0.7441125 , 0.74463185,
       0.74369701, 0.74420204, 0.74418055, 0.72690576, 0.72698814,
       0.7271529 , 0.72650819, 0.72673742, 0.72705619, 0.72718156,
       0.72721021, 0.72707768, 0.72685203, 0.72651177, 0.72677682,
       0.72697381, 0.72733557, 0.7271529 , 0.72647237, 0.72678756,
       0.72714216, 0.72687352, 0.72691651, 0.72675891, 0.72687352,
       0.72691651, 0.72675891, 0.72667653, 0.72662638, 0.72661564,
       0.74328153, 0.74416264, 0.74441336, 0.74429875, 0.74471423,
       0.74459245, 0.74450291, 0.74488257, 0.74480736, 0.74428084,
       0.7446999 , 0.74469274, 0.7439943 , 0.74470707, 0.74471781,
       0.74410891, 0.74482526, 0.74488257, 0.74405519, 0.74424502,
       0.74443844, 0.74405519, 0.74424502, 0.74443844, 0.74342122,
       0.74396564, 0.74409817, 0.73784801, 0.73831364, 0.73837453,
       0.7416769 , 0.74230729, 0.74225357, 0.74386177, 0.74401579,
       0.7444456 , 0.7437579 , 0.74446709, 0.74444202, 0.74372925,
       0.74449574, 0.74456021, 0.7447393 , 0.74502584, 0.7453482 ,
       0.7438761 , 0.74455305, 0.74467125, 0.7438761 , 0.74455305,
       0.74467125, 0.74441336, 0.74473214, 0.74478586]), 'std_test_score': array([0.0011383 , 0.00098686, 0.00082527, 0.00050995, 0.00036684,
       0.00049991, 0.0008291 , 0.00101392, 0.00096373, 0.00110867,
       0.00081422, 0.00048827, 0.0002882 , 0.00017974, 0.00040267,
       0.00049651, 0.00055225, 0.00065357, 0.00104007, 0.00071387,
       0.00059789, 0.00104007, 0.00071387, 0.00059789, 0.00038917,
       0.00050514, 0.00055694, 0.00128026, 0.00123573, 0.00112777,
       0.00111472, 0.00128469, 0.00123451, 0.00109336, 0.00168767,
       0.00144937, 0.00098779, 0.00128426, 0.00123816, 0.00142963,
       0.00116408, 0.00121513, 0.00103317, 0.00082286, 0.00084226,
       0.00040323, 0.0009691 , 0.00091491, 0.00040323, 0.0009691 ,
       0.00091491, 0.00122722, 0.00125838, 0.00129848, 0.00059453,
       0.00080846, 0.00081596, 0.00073837, 0.0004652 , 0.00061034,
       0.00052888, 0.00084352, 0.00092317, 0.00091359, 0.00090329,
       0.00072064, 0.00053352, 0.00064877, 0.00079991, 0.00115647,
       0.00132175, 0.0010696 , 0.00151581, 0.00120883, 0.00116018,
       0.00151581, 0.00120883, 0.00116018, 0.00101962, 0.0014915 ,
       0.00141746, 0.00119208, 0.00087439, 0.00071807, 0.00052615,
       0.00069859, 0.00068688, 0.00109707, 0.00125211, 0.00089393,
       0.00086852, 0.00057916, 0.00067339, 0.00072293, 0.00057856,
       0.0006989 , 0.00068957, 0.00047045, 0.00070921, 0.00111668,
       0.0005793 , 0.00059732, 0.00111668, 0.0005793 , 0.00059732,
       0.00037921, 0.00033908, 0.00035165, 0.00083144, 0.00089801,
       0.00098829, 0.00093875, 0.00072324, 0.0010792 , 0.00122431,
       0.0013652 , 0.00135997, 0.00075842, 0.00093344, 0.00110297,
       0.00112411, 0.00131035, 0.0011806 , 0.00088393, 0.00118963,
       0.00099057, 0.00105304, 0.00089091, 0.00109392, 0.00105304,
       0.00089091, 0.00109392, 0.0010309 , 0.00140226, 0.00131647,
       0.00049069, 0.00070462, 0.00057114, 0.0005172 , 0.00035081,
       0.00054183, 0.00027374, 0.00064951, 0.00066253, 0.00050631,
       0.00052004, 0.00035349, 0.00045741, 0.00090125, 0.00084723,
       0.00077669, 0.00092558, 0.00063609, 0.00100819, 0.00107015,
       0.00097762, 0.00100819, 0.00107015, 0.00097762, 0.00102799,
       0.0010012 , 0.0011079 , 0.00038483, 0.00047484, 0.00051342,
       0.00033745, 0.00065568, 0.00073408, 0.0010333 , 0.00063909,
       0.00056192, 0.00088178, 0.00077017, 0.00058499, 0.00141294,
       0.00098429, 0.00096509, 0.00131507, 0.00044722, 0.0004692 ,
       0.00130658, 0.00083732, 0.00074498, 0.00130658, 0.00083732,
       0.00074498, 0.00054093, 0.00079753, 0.00074467, 0.00062117,
       0.00049425, 0.00061709, 0.00070502, 0.00065848, 0.00090458,
       0.00130889, 0.00108392, 0.00114516, 0.00139435, 0.0009186 ,
       0.00090076, 0.00083003, 0.00090823, 0.00104014, 0.00061023,
       0.00106035, 0.00104985, 0.00112968, 0.0010466 , 0.00137042,
       0.00112968, 0.0010466 , 0.00137042, 0.00109874, 0.00107118,
       0.00084665, 0.00063183, 0.00060483, 0.00076879, 0.00064292,
       0.00043669, 0.0007811 , 0.00089047, 0.00064495, 0.00054776,
       0.0006557 , 0.00073253, 0.0005976 , 0.00069373, 0.00063131,
       0.00088408, 0.00050588, 0.00083348, 0.0007712 , 0.00140629,
       0.0011616 , 0.00088676, 0.00140629, 0.0011616 , 0.00088676,
       0.00088656, 0.00083191, 0.00092125, 0.00083806, 0.00059883,
       0.00079782, 0.00106015, 0.00069837, 0.00076753, 0.00044685,
       0.00059473, 0.0005281 , 0.00095151, 0.00068708, 0.00073449,
       0.00059844, 0.00061246, 0.0007071 , 0.00075882, 0.00059155,
       0.00050574, 0.00222771, 0.00132113, 0.00120341, 0.00222771,
       0.00132113, 0.00120341, 0.00146346, 0.00089871, 0.00100546,
       0.00093417, 0.00109243, 0.00114316, 0.00077421, 0.00072971,
       0.0006019 , 0.00105285, 0.00085223, 0.00098095, 0.00026651,
       0.00064074, 0.00093439, 0.00066874, 0.00055883, 0.00086314,
       0.00125327, 0.00116814, 0.00107622, 0.0009853 , 0.00121742,
       0.00106186, 0.0009853 , 0.00121742, 0.00106186, 0.00103089,
       0.00114798, 0.0010668 , 0.00047298, 0.00081946, 0.00078515,
       0.00078921, 0.00031862, 0.00031889, 0.00042885, 0.00057871,
       0.0005267 , 0.00074163, 0.00097379, 0.00088294, 0.00065293,
       0.00086082, 0.00068119, 0.00083743, 0.00077114, 0.0008493 ,
       0.0009272 , 0.00084187, 0.0006531 , 0.0009272 , 0.00084187,
       0.0006531 , 0.0008132 , 0.00061441, 0.00069014]), 'rank_test_score': array([305, 253, 243, 274, 257, 245, 260, 242, 235, 224, 232, 218, 230,
       236, 233, 307, 225, 240, 290, 258, 246, 290, 258, 246, 241, 233,
       231, 178, 135, 136, 163, 109,  78, 161, 108, 107, 158, 137, 139,
       176, 142, 151, 187, 142, 134, 198, 190, 185, 198, 190, 185, 193,
       188, 184, 210, 207, 206, 179, 147, 128, 117,  47,  64, 148,  76,
        89, 177, 120,  90, 121,  67,  79, 196, 174, 169, 196, 174, 169,
       200, 168, 157, 299, 285, 280, 289, 292, 262, 285, 277, 260, 314,
       298, 281, 322, 319, 317, 270, 262, 296, 281, 277, 294, 281, 277,
       294, 274, 308, 269, 167, 124,  88, 146,  83, 101, 122,  61,  59,
       156, 117,  97, 132,  85,  86, 180, 133, 127, 182, 164, 169, 182,
       164, 169, 192, 181, 189, 208, 202, 201, 166, 116,  84, 125,  35,
        47, 114,  55,  41, 111,  69,  28,  77,  26,  23, 154, 144, 140,
       154, 144, 140, 159, 153, 131, 324, 296, 285, 292, 228, 217, 248,
       219, 221, 222, 227, 248, 244, 223, 229, 252, 225, 237, 272, 255,
       248, 272, 255, 248, 238, 239, 220, 173, 160, 152,  99,  58,  33,
        65,   8,   3,  66,  51,  34,  67,  17,  36,  37,  11,  29, 112,
        93,  73, 112,  93,  73, 123, 100, 105, 216, 215, 214, 209, 205,
       204, 162,  96,  82, 148,  98,  91, 126, 106,  54,  41,  13,   6,
       129,  70,  24, 129,  70,  24, 119,  60,  62, 302, 284, 266, 321,
       313, 276, 265, 262, 270, 306, 320, 310, 285, 254, 266, 322, 309,
       268, 303, 300, 311, 303, 300, 311, 315, 316, 318, 148,  63,  49,
        52,  16,  27,  38,   4,   9,  53,  19,  20,  91,  18,  15,  72,
         7,   4,  80,  55,  45,  80,  55,  45, 138,  95,  75, 213, 212,
       211, 203, 194, 195, 104,  86,  43, 110,  40,  44, 115,  39,  30,
        12,   2,   1, 102,  31,  21, 102,  31,  21,  49,  14,  10])}
In [10]:
del sub_train_set
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
In [11]:
params = grid.best_params_
params['n_jobs'] = -1
params['verbose'] = 5
best_model = _RandomForestClassifier(**params)
best_model.get_params()
Out[11]:
{'bootstrap': False,
 'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 50,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 2,
 'min_samples_split': 10,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 1000,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': None,
 'verbose': 5,
 'warm_start': False}
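Since fitting 1000 trees takes tens of minutes (see the timing log below), it is worth persisting the fitted estimator so it never has to be retrained. A minimal sketch with `joblib` (already imported above as `_jl`); the model, data, and file path here are all illustrative:

```python
import os
import tempfile
import joblib
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Hypothetical stand-ins for the notebook's `best_model` and training data.
X, y = make_classification(n_samples=100, random_state=0)
model = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

# Dump the fitted estimator to disk and load it back.
path = os.path.join(tempfile.gettempdir(), 'best_model.joblib')
joblib.dump(model, path)
restored = joblib.load(path)

# The restored model reproduces the original predictions.
assert (restored.predict(X) == model.predict(X)).all()
```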
In [19]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 12 concurrent workers.
building tree 1 of 1000
...
[Parallel(n_jobs=-1)]: Done  48 tasks      | elapsed:  2.0min
...
[Parallel(n_jobs=-1)]: Done 138 tasks      | elapsed:  5.6min
...
[Parallel(n_jobs=-1)]: Done 264 tasks      | elapsed: 10.3min
...
[Parallel(n_jobs=-1)]: Done 426 tasks      | elapsed: 16.2min
...
building tree 588 of 1000
building tree 589 of 1000
building tree 590 of 1000
building tree 591 of 1000
building tree 592 of 1000
building tree 593 of 1000
building tree 594 of 1000
building tree 595 of 1000
building tree 596 of 1000
building tree 597 of 1000
building tree 598 of 1000
building tree 599 of 1000
building tree 600 of 1000
building tree 601 of 1000
building tree 602 of 1000
building tree 603 of 1000
building tree 604 of 1000
building tree 605 of 1000
building tree 606 of 1000
building tree 607 of 1000
building tree 608 of 1000
building tree 609 of 1000
building tree 610 of 1000
building tree 611 of 1000
building tree 612 of 1000
building tree 613 of 1000
building tree 614 of 1000
building tree 615 of 1000
building tree 616 of 1000
building tree 617 of 1000
building tree 618 of 1000
building tree 619 of 1000
building tree 620 of 1000
building tree 621 of 1000
building tree 622 of 1000
building tree 623 of 1000
building tree 624 of 1000
building tree 625 of 1000
building tree 626 of 1000
building tree 627 of 1000
building tree 628 of 1000
building tree 629 of 1000
building tree 630 of 1000
building tree 631 of 1000
building tree 632 of 1000
building tree 633 of 1000
building tree 634 of 1000
building tree 635 of 1000
[Parallel(n_jobs=-1)]: Done 624 tasks      | elapsed: 23.3min
building tree 636 of 1000
building tree 637 of 1000
building tree 638 of 1000
building tree 639 of 1000
building tree 640 of 1000
building tree 641 of 1000
building tree 642 of 1000
building tree 643 of 1000
building tree 644 of 1000
building tree 645 of 1000
building tree 646 of 1000
building tree 647 of 1000
building tree 648 of 1000
building tree 649 of 1000
building tree 650 of 1000
building tree 651 of 1000
building tree 652 of 1000
building tree 653 of 1000
building tree 654 of 1000
building tree 655 of 1000
building tree 656 of 1000
building tree 657 of 1000
building tree 658 of 1000
building tree 659 of 1000
building tree 660 of 1000
building tree 661 of 1000
building tree 662 of 1000
building tree 663 of 1000
building tree 664 of 1000
building tree 665 of 1000
building tree 666 of 1000
building tree 667 of 1000
building tree 668 of 1000
building tree 669 of 1000
building tree 670 of 1000
building tree 671 of 1000
building tree 672 of 1000
building tree 673 of 1000
building tree 674 of 1000
building tree 675 of 1000
building tree 676 of 1000
building tree 677 of 1000
building tree 678 of 1000
building tree 679 of 1000
building tree 680 of 1000
building tree 681 of 1000
building tree 682 of 1000
building tree 683 of 1000
building tree 684 of 1000
building tree 685 of 1000
building tree 686 of 1000
building tree 687 of 1000
building tree 688 of 1000
building tree 689 of 1000
building tree 690 of 1000
building tree 691 of 1000
building tree 692 of 1000
building tree 693 of 1000
building tree 694 of 1000
building tree 695 of 1000
building tree 696 of 1000
building tree 697 of 1000
building tree 698 of 1000
building tree 699 of 1000
building tree 700 of 1000
building tree 701 of 1000
building tree 702 of 1000
building tree 703 of 1000
building tree 704 of 1000
building tree 705 of 1000
building tree 706 of 1000
building tree 707 of 1000
building tree 708 of 1000
building tree 709 of 1000
building tree 710 of 1000
building tree 711 of 1000
building tree 712 of 1000
building tree 713 of 1000
building tree 714 of 1000
building tree 715 of 1000
building tree 716 of 1000
building tree 717 of 1000
building tree 718 of 1000
building tree 719 of 1000
building tree 720 of 1000
building tree 721 of 1000
building tree 722 of 1000
building tree 723 of 1000
building tree 724 of 1000
building tree 725 of 1000
building tree 726 of 1000
building tree 727 of 1000
building tree 728 of 1000
building tree 729 of 1000
building tree 730 of 1000
building tree 731 of 1000
building tree 732 of 1000
building tree 733 of 1000
building tree 734 of 1000
building tree 735 of 1000
building tree 736 of 1000
building tree 737 of 1000
building tree 738 of 1000
building tree 739 of 1000
building tree 740 of 1000
building tree 741 of 1000
building tree 742 of 1000
building tree 743 of 1000
building tree 744 of 1000
building tree 745 of 1000
building tree 746 of 1000
building tree 747 of 1000
building tree 748 of 1000
building tree 749 of 1000
building tree 750 of 1000
building tree 751 of 1000
building tree 752 of 1000
building tree 753 of 1000
building tree 754 of 1000
building tree 755 of 1000
building tree 756 of 1000
building tree 757 of 1000
building tree 758 of 1000
building tree 759 of 1000
building tree 760 of 1000
building tree 761 of 1000
building tree 762 of 1000
building tree 763 of 1000
building tree 764 of 1000
building tree 765 of 1000
building tree 766 of 1000
building tree 767 of 1000
building tree 768 of 1000
building tree 769 of 1000
building tree 770 of 1000
building tree 771 of 1000
building tree 772 of 1000
building tree 773 of 1000
building tree 774 of 1000
building tree 775 of 1000
building tree 776 of 1000
building tree 777 of 1000
building tree 778 of 1000
building tree 779 of 1000
building tree 780 of 1000
building tree 781 of 1000
building tree 782 of 1000
building tree 783 of 1000
building tree 784 of 1000
building tree 785 of 1000
building tree 786 of 1000
building tree 787 of 1000
building tree 788 of 1000
building tree 789 of 1000
building tree 790 of 1000
building tree 791 of 1000
building tree 792 of 1000
building tree 793 of 1000
building tree 794 of 1000
building tree 795 of 1000
building tree 796 of 1000
building tree 797 of 1000
building tree 798 of 1000
building tree 799 of 1000
building tree 800 of 1000
building tree 801 of 1000
building tree 802 of 1000
building tree 803 of 1000
building tree 804 of 1000
building tree 805 of 1000
building tree 806 of 1000
building tree 807 of 1000
building tree 808 of 1000
building tree 809 of 1000
building tree 810 of 1000
building tree 811 of 1000
building tree 812 of 1000
building tree 813 of 1000
building tree 814 of 1000
building tree 815 of 1000
building tree 816 of 1000
building tree 817 of 1000
building tree 818 of 1000
building tree 819 of 1000
building tree 820 of 1000
building tree 821 of 1000
building tree 822 of 1000
building tree 823 of 1000
building tree 824 of 1000
building tree 825 of 1000
building tree 826 of 1000
building tree 827 of 1000
building tree 828 of 1000
building tree 829 of 1000
building tree 830 of 1000
building tree 831 of 1000
building tree 832 of 1000
building tree 833 of 1000
building tree 834 of 1000
building tree 835 of 1000
building tree 836 of 1000
building tree 837 of 1000
building tree 838 of 1000
building tree 839 of 1000
building tree 840 of 1000
building tree 841 of 1000
building tree 842 of 1000
building tree 843 of 1000
building tree 844 of 1000
building tree 845 of 1000
building tree 846 of 1000
building tree 847 of 1000
building tree 848 of 1000
building tree 849 of 1000
building tree 850 of 1000
building tree 851 of 1000
building tree 852 of 1000
building tree 853 of 1000
building tree 854 of 1000
building tree 855 of 1000
building tree 856 of 1000
building tree 857 of 1000
building tree 858 of 1000
building tree 859 of 1000
building tree 860 of 1000
building tree 861 of 1000
building tree 862 of 1000
building tree 863 of 1000
building tree 864 of 1000
building tree 865 of 1000
building tree 866 of 1000
building tree 867 of 1000
building tree 868 of 1000
building tree 869 of 1000
[Parallel(n_jobs=-1)]: Done 858 tasks      | elapsed: 31.8min
building tree 870 of 1000
building tree 871 of 1000
building tree 872 of 1000
building tree 873 of 1000
building tree 874 of 1000
building tree 875 of 1000
building tree 876 of 1000
building tree 877 of 1000
building tree 878 of 1000
building tree 879 of 1000
building tree 880 of 1000
building tree 881 of 1000
building tree 882 of 1000
building tree 883 of 1000
building tree 884 of 1000
building tree 885 of 1000
building tree 886 of 1000
building tree 887 of 1000
building tree 888 of 1000
building tree 889 of 1000
building tree 890 of 1000
building tree 891 of 1000
building tree 892 of 1000
building tree 893 of 1000
building tree 894 of 1000
building tree 895 of 1000
building tree 896 of 1000
building tree 897 of 1000
building tree 898 of 1000
building tree 899 of 1000
building tree 900 of 1000
building tree 901 of 1000
building tree 902 of 1000
building tree 903 of 1000
building tree 904 of 1000
building tree 905 of 1000
building tree 906 of 1000
building tree 907 of 1000
building tree 908 of 1000
building tree 909 of 1000
building tree 910 of 1000
building tree 911 of 1000
building tree 912 of 1000
building tree 913 of 1000
building tree 914 of 1000
building tree 915 of 1000
building tree 916 of 1000
building tree 917 of 1000
building tree 918 of 1000
building tree 919 of 1000
building tree 920 of 1000
building tree 921 of 1000
building tree 922 of 1000
building tree 923 of 1000
building tree 924 of 1000
building tree 925 of 1000
building tree 926 of 1000
building tree 927 of 1000
building tree 928 of 1000
building tree 929 of 1000
building tree 930 of 1000
building tree 931 of 1000
building tree 932 of 1000
building tree 933 of 1000
building tree 934 of 1000
building tree 935 of 1000
building tree 936 of 1000
building tree 937 of 1000
building tree 938 of 1000
building tree 939 of 1000
building tree 940 of 1000
building tree 941 of 1000
building tree 942 of 1000
building tree 943 of 1000
building tree 944 of 1000
building tree 945 of 1000
building tree 946 of 1000
building tree 947 of 1000
building tree 948 of 1000
building tree 949 of 1000
building tree 950 of 1000
building tree 951 of 1000
building tree 952 of 1000
building tree 953 of 1000
building tree 954 of 1000
building tree 955 of 1000
building tree 956 of 1000
building tree 957 of 1000
building tree 958 of 1000
building tree 959 of 1000
building tree 960 of 1000
building tree 961 of 1000
building tree 962 of 1000
building tree 963 of 1000
building tree 964 of 1000
building tree 965 of 1000
building tree 966 of 1000
building tree 967 of 1000
building tree 968 of 1000
building tree 969 of 1000
building tree 970 of 1000
building tree 971 of 1000
building tree 972 of 1000
building tree 973 of 1000
building tree 974 of 1000
building tree 975 of 1000
building tree 976 of 1000
building tree 977 of 1000
building tree 978 of 1000
building tree 979 of 1000
building tree 980 of 1000
building tree 981 of 1000
building tree 982 of 1000
building tree 983 of 1000
building tree 984 of 1000
building tree 985 of 1000
building tree 986 of 1000
building tree 987 of 1000
building tree 988 of 1000
building tree 989 of 1000
building tree 990 of 1000
building tree 991 of 1000
building tree 992 of 1000
building tree 993 of 1000
building tree 994 of 1000
building tree 995 of 1000
building tree 996 of 1000
building tree 997 of 1000
building tree 998 of 1000
building tree 999 of 1000
building tree 1000 of 1000
[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed: 36.7min finished
Out[19]:
RandomForestClassifier(bootstrap=False, class_weight=None, criterion='entropy',
                       max_depth=50, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=2, min_samples_split=10,
                       min_weight_fraction_leaf=0.0, n_estimators=1000,
                       n_jobs=-1, oob_score=False, random_state=None, verbose=5,
                       warm_start=False)
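For reference, the tuned estimator above can be re-created directly from the grid-search result. This is only a sketch: it assumes the prepared training matrices (called `X_train` and `y_train` here) are available, and the fit call is left commented out since training takes roughly 37 minutes.

```python
from sklearn.ensemble import RandomForestClassifier

# Re-create the classifier with the hyperparameters found by the search.
# X_train / y_train are assumed to be the prepared features and labels.
best_model = RandomForestClassifier(
    n_estimators=1000,
    criterion='entropy',
    max_depth=50,
    min_samples_split=10,
    min_samples_leaf=2,
    bootstrap=False,
    n_jobs=-1,   # use all cores
    verbose=5,   # produces the per-tree log shown above
)
# best_model.fit(X_train, y_train)  # ~37 min on the full training set
```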
In [20]:
_jl.dump(best_model, "../models/best_Random_Forest_2.joblib")
Out[20]:
['../models/best_Random_Forest_2.joblib']
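Persisting with joblib keeps the fitted trees intact, so the model can be reloaded later without retraining. A minimal round-trip sketch, using a toy dataset and a temporary file in place of the real model and path:

```python
import os
import tempfile

import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for the real training data and model.
X, y = make_classification(n_samples=100, n_features=5, random_state=0)
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "rf.joblib")
    joblib.dump(clf, path)          # serialize the fitted estimator
    reloaded = joblib.load(path)    # deserialize it back

# The reloaded model reproduces the original predictions exactly.
same_predictions = (clf.predict(X) == reloaded.predict(X)).all()
```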
In [21]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.head()
Out[21]:
(test_set.head() output truncated for readability: sample rows across several hundred columns — review/user/business identifiers; review, user, and restaurant star statistics; binary and real truth scores; per-cuisine average ratings in raw, `_bin`, and `_real` variants; opening/closing hours; one-hot encoded attributes (`OutdoorSeating_*`, `BusinessAcceptsCreditCards_*`, `RestaurantsDelivery_*`, `RestaurantsReservations_*`, `WiFi_*`, `Alcohol_*`); and one-hot encoded `city_*` and `categories_*` columns, which are almost entirely sparse 0/1 values.)
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2 P6ZgOVwyGlvis4m2Cra13g uH8tTLb3Fz64GtEMWpZZcQ --6MefnULPED_I942VcFNA 5 0 0 0 1 0.990602 4.300000 4.333333 4.299574 4.349620 4.302949 4.288981 1 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 4.108108 227 2 7 99 47 30 286 37.0 4.161290 31.0 4.130733 31.326167 4.300000 3.75000 3.686094 3.500000 4.454545 3.666667 3.800000 3.933014 4.000000 4.000000 4.333333 3.750000 3.662669 3.000000 4.555556 3.50000 4.000000 3.904608 4.000000 4.000000 4.299574 3.724926 3.678871 3.340936 4.538601 3.626374 3.946442 3.928912 4.00000 4.000000 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 ap-_EXUS49YhyukC23p2Gw NQhvEYuYOa5psBxEoNvq2g --6MefnULPED_I942VcFNA 1 0 0 0 1 0.968214 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.009869 3.703313 1 0 0 0 0 0 2110 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 tKFDdiQ7rGMsdxgvIz2Sxg zbrH2lnZgWicvwoXR0qcSg --9e1ONYQuAa-CB_Rrw7Tw 5 0 0 0 1 0.995667 3.726715 3.704594 3.719363 3.752465 3.722907 3.760889 1 530.0 36.123183 -115.169190 1613 4.0 47.0 42.0 46.0 45.0 43.0 41.0 46.0 59.0 61.0 60.0 67.0 69.0 68.0 61.0 4.107048 1504.0 4.088816 1216.0 4.092415 1205.738732 0.009869 3.703313 59 0 0 5 0 1 22 0.0 3.703313 0.0 3.703313 0.000000 3.556721 3.79608 3.686094 3.777956 3.736355 3.684951 3.789846 3.933014 3.868171 3.770015 3.542997 3.763461 3.662669 3.749776 3.719581 3.66752 3.771654 3.904608 3.851784 3.744434 3.555679 3.789966 3.678871 3.770170 3.730780 3.676024 3.788204 3.928912 3.86728 3.767263 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
In [22]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
[Parallel(n_jobs=12)]: Using backend ThreadingBackend with 12 concurrent workers.
[Parallel(n_jobs=12)]: Done  48 tasks      | elapsed:    1.8s
[Parallel(n_jobs=12)]: Done 138 tasks      | elapsed:    4.8s
[Parallel(n_jobs=12)]: Done 264 tasks      | elapsed:    8.9s
[Parallel(n_jobs=12)]: Done 426 tasks      | elapsed:   13.3s
[Parallel(n_jobs=12)]: Done 624 tasks      | elapsed:   18.8s
[Parallel(n_jobs=12)]: Done 858 tasks      | elapsed:   25.2s
[Parallel(n_jobs=12)]: Done 1000 out of 1000 | elapsed:   28.9s finished
predictions:
 [0 0 1 ... 0 1 0]
In [23]:
set(predic)
Out[23]:
{0, 1}
In [24]:
# evaluate classifier

print("Report for Random Forest classifier:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Random Forest Classifier:", _accuracy_score(test_set['likes'], predic)*100)
Report for Random Forest classifier:
              precision    recall  f1-score   support

           0       0.69      0.40      0.51     50930
           1       0.75      0.91      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.66      0.67    153993
weighted avg       0.73      0.74      0.72    153993

Accuracy for Random Forest Classifier: 74.16765697142078
In [25]:
# Confusion matrix for Random Forest

print("Confusion Matrix for Random Forest: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for Random Forest: 
Out[25]:
array([[20372, 30558],
       [ 9222, 93841]], dtype=int64)
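The classification report above can be cross-checked directly against this confusion matrix. A minimal sketch using the matrix values printed above (rows are true labels, columns are predicted labels):

```python
import numpy as np

# Confusion matrix from the output above: rows = true class, columns = predicted class
cm = np.array([[20372, 30558],
               [ 9222, 93841]])
tn, fp = cm[0]  # true negatives, false positives
fn, tp = cm[1]  # false negatives, true positives

precision_1 = tp / (tp + fp)   # fraction of predicted "likes" that were real
recall_1 = tp / (tp + fn)      # fraction of real "likes" that were found
accuracy = (tp + tn) / cm.sum()

print(round(precision_1, 2), round(recall_1, 2), round(accuracy, 4))
# matches the report for class 1: 0.75 0.91 0.7417
```

This confirms the pattern in the report: the classifier is heavily biased towards the majority class 1 (high recall for "likes", low recall of 0.40 for class 0).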
In [26]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Random Forest ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
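Note that feeding hard 0/1 predictions into `_roc_curve` yields only a couple of points, so the "curve" is just a line segment. To trace the full curve, the classifier's scores should be used instead (for a Random Forest, `predict_proba(...)[:, 1]`). A minimal sketch on hypothetical toy labels and scores, not the Yelp data:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical ground truth and class-1 scores (e.g. from predict_proba(...)[:, 1])
y_true = np.array([0, 0, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8])

fpr, tpr, thresholds = roc_curve(y_true, scores)  # one point per threshold
print(roc_auc_score(y_true, scores))  # 0.75
```

With continuous scores the curve has one point per distinct threshold, and the area under it (AUC) summarizes ranking quality in a single number.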
In [ ]:
%reset

6.3 Deep Learning

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.head()
Out[3]:
[train set preview truncated: 5 rows showing review_id, user_id, business_id, stars_review, the 'likes' target, numeric review/user/restaurant statistics, per-cuisine average ratings, and several hundred one-hot encoded attribute (OutdoorSeating, WiFi, Alcohol, ...), city_* and categories_* columns]
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 0nVZ9Cg1A1lVA8EFWbt5lg EisUuXVeVJN_FcFiE-tqwA --6MefnULPED_I942VcFNA 3 2 0 0 1 0.927789 2.966667 2.923077 2.954478 2.985799 2.973742 2.915633 0 3267.0 43.841694 -79.399755 44 3.0 29.0 26.0 30.0 29.0 27.0 25.0 29.0 61.0 63.0 62.0 69.0 69.0 68.0 63.0 3.157895 38.0 3.218750 32.0 3.218815 33.948759 0.000000 2.980769 212 3 23 307 157 71 2902 104.0 3.000000 88.0 2.997169 88.020993 2.966667 4.00000 3.686094 2.000000 3.040000 2.600000 3.600000 2.500000 1.000000 3.750000 2.923077 4.000000 3.662669 1.666667 3.146341 2.60000 3.750000 2.500000 1.000000 3.666667 2.954478 4.000000 3.678871 1.826533 3.115334 2.630356 3.621347 2.500236 1.000000 3.828739 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0

Rule of thumb for calculating the number of neurons per hidden layer:

$N_h = \frac {N_s} {(\alpha * (N_i + N_o))}$

  • $N_h$ is the number of hidden neurons
  • $N_i$ is the number of input neurons
  • $N_o$ is the number of output neurons
  • $N_s$ is the number of samples in the training data set
  • $\alpha$ is an arbitrary scaling factor, usually between 5 and 10
In [7]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 6

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
119
In [8]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [9]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 206s 527us/step - loss: 0.6102 - acc: 0.6933 - val_loss: 0.5960 - val_acc: 0.7106
Epoch 2/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5606 - acc: 0.7197 - val_loss: 0.5684 - val_acc: 0.7193
Epoch 3/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5503 - acc: 0.7273 - val_loss: 0.5508 - val_acc: 0.7308
Epoch 4/100
390870/390870 [==============================] - 24s 63us/step - loss: 0.5454 - acc: 0.7317 - val_loss: 0.5547 - val_acc: 0.7287
Epoch 5/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5414 - acc: 0.7337 - val_loss: 0.5375 - val_acc: 0.7373
Epoch 6/100
390870/390870 [==============================] - 25s 63us/step - loss: 0.5393 - acc: 0.7350 - val_loss: 0.5453 - val_acc: 0.7272
Epoch 7/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5381 - acc: 0.7366 - val_loss: 0.5348 - val_acc: 0.7402
Epoch 8/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5354 - acc: 0.7380 - val_loss: 0.5390 - val_acc: 0.7395
Epoch 9/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5340 - acc: 0.7397 - val_loss: 0.5313 - val_acc: 0.7390
Epoch 10/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5322 - acc: 0.7405 - val_loss: 0.5354 - val_acc: 0.7448
Epoch 11/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5326 - acc: 0.7408 - val_loss: 0.5372 - val_acc: 0.7415
Epoch 12/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5298 - acc: 0.7423 - val_loss: 0.5361 - val_acc: 0.7384
Epoch 13/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5300 - acc: 0.7420 - val_loss: 0.5407 - val_acc: 0.7356
Epoch 14/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5290 - acc: 0.7425 - val_loss: 0.5250 - val_acc: 0.7481
Epoch 15/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5274 - acc: 0.7433 - val_loss: 0.5267 - val_acc: 0.7436
Epoch 16/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5255 - acc: 0.7449 - val_loss: 0.5217 - val_acc: 0.7494
Epoch 17/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5254 - acc: 0.7451 - val_loss: 0.5313 - val_acc: 0.7460
Epoch 18/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5241 - acc: 0.7460 - val_loss: 0.5246 - val_acc: 0.7471
Epoch 19/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5240 - acc: 0.7460 - val_loss: 0.5237 - val_acc: 0.7465
Epoch 20/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5234 - acc: 0.7461 - val_loss: 0.5250 - val_acc: 0.7481
Epoch 21/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5226 - acc: 0.7462 - val_loss: 0.5417 - val_acc: 0.7359
Epoch 22/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5234 - acc: 0.7465 - val_loss: 0.5258 - val_acc: 0.7474
Epoch 23/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5222 - acc: 0.7472 - val_loss: 0.5217 - val_acc: 0.7499
Epoch 24/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5210 - acc: 0.7478 - val_loss: 0.5267 - val_acc: 0.7442
Epoch 25/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5207 - acc: 0.7484 - val_loss: 0.5455 - val_acc: 0.7354
Epoch 26/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5202 - acc: 0.7486 - val_loss: 0.5288 - val_acc: 0.7449
Epoch 27/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5214 - acc: 0.7476 - val_loss: 0.5456 - val_acc: 0.7338
Epoch 28/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5209 - acc: 0.7484 - val_loss: 0.5258 - val_acc: 0.7468
Epoch 29/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5204 - acc: 0.7472 - val_loss: 0.5231 - val_acc: 0.7481
Epoch 30/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5190 - acc: 0.7493 - val_loss: 0.5230 - val_acc: 0.7463
Epoch 31/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5189 - acc: 0.7490 - val_loss: 0.5250 - val_acc: 0.7480
Epoch 32/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5205 - acc: 0.7482 - val_loss: 0.5226 - val_acc: 0.7496
Epoch 33/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5202 - acc: 0.7486 - val_loss: 0.5276 - val_acc: 0.7457
Epoch 34/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5172 - acc: 0.7503 - val_loss: 0.5243 - val_acc: 0.7481
Epoch 35/100
390870/390870 [==============================] - 25s 63us/step - loss: 0.5173 - acc: 0.7503 - val_loss: 0.5219 - val_acc: 0.7481
Epoch 36/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5174 - acc: 0.7494 - val_loss: 0.5212 - val_acc: 0.7479
Epoch 37/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5172 - acc: 0.7500 - val_loss: 0.5314 - val_acc: 0.7419
Epoch 38/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5190 - acc: 0.7495 - val_loss: 0.5206 - val_acc: 0.7498
Epoch 39/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5170 - acc: 0.7502 - val_loss: 0.5233 - val_acc: 0.7471
Epoch 40/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5176 - acc: 0.7498 - val_loss: 0.5207 - val_acc: 0.7506
Epoch 41/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5187 - acc: 0.7491 - val_loss: 0.5242 - val_acc: 0.7474
Epoch 42/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5160 - acc: 0.7508 - val_loss: 0.5182 - val_acc: 0.7502
Epoch 43/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5154 - acc: 0.7512 - val_loss: 0.5214 - val_acc: 0.7492
Epoch 44/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5152 - acc: 0.7511 - val_loss: 0.5259 - val_acc: 0.7432
Epoch 45/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5155 - acc: 0.7511 - val_loss: 0.5198 - val_acc: 0.7497
Epoch 46/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5165 - acc: 0.7506 - val_loss: 0.5294 - val_acc: 0.7464
Epoch 47/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5163 - acc: 0.7510 - val_loss: 0.5256 - val_acc: 0.7462
Epoch 48/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5156 - acc: 0.7510 - val_loss: 0.5188 - val_acc: 0.7501
Epoch 49/100
390870/390870 [==============================] - 23s 60us/step - loss: 0.5160 - acc: 0.7499 - val_loss: 0.5257 - val_acc: 0.7473
Epoch 50/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5167 - acc: 0.7518 - val_loss: 0.5215 - val_acc: 0.7491
Epoch 51/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5146 - acc: 0.7513 - val_loss: 0.5201 - val_acc: 0.7514
Epoch 52/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5137 - acc: 0.7524 - val_loss: 0.5248 - val_acc: 0.7475
Epoch 53/100
390870/390870 [==============================] - 23s 60us/step - loss: 0.5141 - acc: 0.7523 - val_loss: 0.5217 - val_acc: 0.7475
Epoch 54/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5132 - acc: 0.7525 - val_loss: 0.5188 - val_acc: 0.7510
Epoch 55/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5134 - acc: 0.7523 - val_loss: 0.5183 - val_acc: 0.7523
Epoch 56/100
390870/390870 [==============================] - 23s 60us/step - loss: 0.5128 - acc: 0.7526 - val_loss: 0.5206 - val_acc: 0.7486
Epoch 57/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5130 - acc: 0.7525 - val_loss: 0.5402 - val_acc: 0.7370
Epoch 58/100
390870/390870 [==============================] - 24s 60us/step - loss: 0.5136 - acc: 0.7521 - val_loss: 0.5204 - val_acc: 0.7499
Epoch 59/100
390870/390870 [==============================] - 25s 64us/step - loss: 0.5135 - acc: 0.7517 - val_loss: 0.5186 - val_acc: 0.7510
Epoch 60/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5132 - acc: 0.7522 - val_loss: 0.5194 - val_acc: 0.7507
Epoch 61/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5131 - acc: 0.7523 - val_loss: 0.5266 - val_acc: 0.7503
Epoch 62/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5115 - acc: 0.7538 - val_loss: 0.5164 - val_acc: 0.7518
Epoch 63/100
390870/390870 [==============================] - 24s 61us/step - loss: 0.5140 - acc: 0.7522 - val_loss: 0.5255 - val_acc: 0.7511
Epoch 64/100
390870/390870 [==============================] - 24s 62us/step - loss: 0.5138 - acc: 0.7515 - val_loss: 0.5199 - val_acc: 0.7496
Epoch 65/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5141 - acc: 0.7517 - val_loss: 0.5218 - val_acc: 0.7502
Epoch 66/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5135 - acc: 0.7525 - val_loss: 0.5220 - val_acc: 0.7477
Epoch 67/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5152 - acc: 0.7519 - val_loss: 0.5243 - val_acc: 0.7497
Epoch 68/100
390870/390870 [==============================] - 25s 63us/step - loss: 0.5137 - acc: 0.7518 - val_loss: 0.5184 - val_acc: 0.7512
Epoch 69/100
390870/390870 [==============================] - 25s 63us/step - loss: 0.5174 - acc: 0.7515 - val_loss: 0.5292 - val_acc: 0.7474
Epoch 70/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5138 - acc: 0.7516 - val_loss: 0.5265 - val_acc: 0.7449
Epoch 71/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5127 - acc: 0.7534 - val_loss: 0.5263 - val_acc: 0.7479
Epoch 72/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5125 - acc: 0.7533 - val_loss: 0.5214 - val_acc: 0.7481
Epoch 73/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5119 - acc: 0.7531 - val_loss: 0.5209 - val_acc: 0.7503
Epoch 74/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5123 - acc: 0.7526 - val_loss: 0.5200 - val_acc: 0.7488
Epoch 75/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5109 - acc: 0.7536 - val_loss: 0.5184 - val_acc: 0.7513
Epoch 76/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5133 - acc: 0.7517 - val_loss: 0.5296 - val_acc: 0.7484
Epoch 77/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5128 - acc: 0.7520 - val_loss: 0.5254 - val_acc: 0.7467
Epoch 78/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5124 - acc: 0.7530 - val_loss: 0.5206 - val_acc: 0.7495
Epoch 79/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5120 - acc: 0.7534 - val_loss: 0.5209 - val_acc: 0.7489
Epoch 80/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5106 - acc: 0.7540 - val_loss: 0.5274 - val_acc: 0.7467
Epoch 81/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5137 - acc: 0.7519 - val_loss: 0.5233 - val_acc: 0.7467
Epoch 82/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5141 - acc: 0.7507 - val_loss: 0.5208 - val_acc: 0.7471
Epoch 83/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5109 - acc: 0.7538 - val_loss: 0.5199 - val_acc: 0.7525
Epoch 84/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5109 - acc: 0.7531 - val_loss: 0.5244 - val_acc: 0.7486
Epoch 85/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5119 - acc: 0.7528 - val_loss: 0.5202 - val_acc: 0.7500
Epoch 86/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5106 - acc: 0.7541 - val_loss: 0.5189 - val_acc: 0.7506
Epoch 87/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5110 - acc: 0.7535 - val_loss: 0.5202 - val_acc: 0.7517
Epoch 88/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5114 - acc: 0.7532 - val_loss: 0.5246 - val_acc: 0.7474
Epoch 89/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5105 - acc: 0.7543 - val_loss: 0.5199 - val_acc: 0.7481
Epoch 90/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5123 - acc: 0.7527 - val_loss: 0.5172 - val_acc: 0.7505
Epoch 91/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5119 - acc: 0.7523 - val_loss: 0.5270 - val_acc: 0.7457
Epoch 92/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5116 - acc: 0.7524 - val_loss: 0.5169 - val_acc: 0.7524
Epoch 93/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5094 - acc: 0.7547 - val_loss: 0.5164 - val_acc: 0.7524
Epoch 94/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5084 - acc: 0.7551 - val_loss: 0.5171 - val_acc: 0.7507
Epoch 95/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5084 - acc: 0.7552 - val_loss: 0.5224 - val_acc: 0.7516
Epoch 96/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5106 - acc: 0.7539 - val_loss: 0.5236 - val_acc: 0.7490
Epoch 97/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5095 - acc: 0.7540 - val_loss: 0.5454 - val_acc: 0.7475
Epoch 98/100
390870/390870 [==============================] - 26s 65us/step - loss: 0.5096 - acc: 0.7538 - val_loss: 0.5237 - val_acc: 0.7514
Epoch 99/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5099 - acc: 0.7541 - val_loss: 0.5264 - val_acc: 0.7490
Epoch 100/100
390870/390870 [==============================] - 25s 65us/step - loss: 0.5098 - acc: 0.7541 - val_loss: 0.5324 - val_acc: 0.7484
Out[9]:
<keras.callbacks.History at 0x234bf5d0908>
In [10]:
# Save trained model
classifier.save("../models/trained_deep_neural_network.h5")
In [11]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 16s 29us/step
[0.516120845819726, 0.7530937380230908]

The neural network outputs the probability that the class is 1; we binarize it with a 0.5 threshold, so labels with probability up to 0.5 are mapped to 0 and the rest to 1. We could instead use two output neurons producing a probability vector (the probability of class 1 and of class 0), but that approach would be equivalent.
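The thresholding step can be sketched as follows — a minimal illustration with hypothetical sigmoid outputs, not the notebook's actual predictions:

```python
import numpy as np

# Hypothetical sigmoid outputs for five reviews (illustrative values only)
probabilities = np.array([[0.12], [0.49], [0.50], [0.73], [0.95]])

# Binarize with a 0.5 threshold: probabilities up to 0.5 map to 0, the rest to 1
predicted_labels = (probabilities > 0.5).astype(int)

print(predicted_labels.ravel())  # -> [0 0 0 1 1]
```

The same result can be obtained with `sklearn.preprocessing.binarize` (imported above as `_binarize`), which also treats values strictly greater than the threshold as 1.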

In [12]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.head()
Out[12]:
review_id user_id business_id stars_review useful_review funny_review cool_review bin_truth_score real_truth_score cuisine_av_hist cuisine_av_hist_bin cuisine_av_hist_real coll_score coll_score_bin coll_score_real likes postal_code latitude longitude review_count stars_restaurant Monday_Open Tuesday_Open Wednesday_Open Thursday_Open Friday_Open Saturday_Open Sunday_Open Monday_Close Tuesday_Close Wednesday_Close Thursday_Close Friday_Close Saturday_Close Sunday_Close average_stars_review num_reviews_review average_stars_bin_review num_reviews_bin_review average_stars_real_review num_reviews_real_review compliment_count average_stars_user review years_of_elite fans useful_user cool_user funny_user friends num_reviews_user average_stars_bin_user num_reviews_bin_user average_stars_real_user num_reviews_real_user av_rat_chinese_cuisine av_rat_japanese_cuisine av_rat_mexican_cuisine av_rat_italian_cuisine av_rat_others_cuisine av_rat_american_cuisine av_rat_korean_cuisine av_rat_mediterranean_cuisine av_rat_thai_cuisine av_rat_asianfusion_cuisine av_rat_chinese_cuisine_bin av_rat_japanese_cuisine_bin av_rat_mexican_cuisine_bin av_rat_italian_cuisine_bin av_rat_others_cuisine_bin av_rat_american_cuisine_bin av_rat_korean_cuisine_bin av_rat_mediterranean_cuisine_bin av_rat_thai_cuisine_bin av_rat_asianfusion_cuisine_bin av_rat_chinese_cuisine_real av_rat_japanese_cuisine_real av_rat_mexican_cuisine_real av_rat_italian_cuisine_real av_rat_others_cuisine_real av_rat_american_cuisine_real av_rat_korean_cuisine_real av_rat_mediterranean_cuisine_real av_rat_thai_cuisine_real av_rat_asianfusion_cuisine_real OutdoorSeating_False OutdoorSeating_None OutdoorSeating_True BusinessAcceptsCreditCards_False BusinessAcceptsCreditCards_None BusinessAcceptsCreditCards_True RestaurantsDelivery_False RestaurantsDelivery_None RestaurantsDelivery_True RestaurantsReservations_False RestaurantsReservations_None RestaurantsReservations_True WiFi_Free WiFi_No WiFi_None WiFi_Paid Alcohol_Beer&Wine 
Alcohol_Full_Bar Alcohol_No Alcohol_None city_Ahwatukee city_Airdrie city_Ajax city_Akron city_Allison Park city_Amherst city_Aurora city_Avon city_Avondale city_Beachwood city_Bellevue city_Belmont city_Berea city_Bethel Park city_Blue Diamond city_Boulder City city_Braddock city_Brampton city_Brecksville city_Bridgeville city_Broadview Heights city_Brook Park city_Brooklyn city_Brossard city_Brunswick city_Buckeye city_Calgary city_Canonsburg city_Carefree city_Carnegie city_Cave Creek city_Chagrin Falls city_Champaign city_Chandler city_Chardon city_Charlotte city_Chesterland city_Cleveland city_Clover city_Concord city_Coraopolis city_Cornelius city_Cuyahoga Falls city_Davidson city_Denver city_Dollard-des-Ormeaux city_Dorval city_East York city_El Mirage city_Elyria city_Etobicoke city_Euclid city_Fairlawn city_Fairview Park city_Fitchburg city_Fort Mill city_Fountain Hills city_Gastonia city_Gilbert city_Glendale city_Goodyear city_Harrisburg city_Henderson city_Highland Heights city_Homestead city_Hudson city_Huntersville city_Independence city_Indian Land city_Indian Trail city_Irwin city_Kannapolis city_Kent city_Lake Wylie city_Lakewood city_Las Vegas city_Laval city_Laveen city_Litchfield Park city_Longueuil city_Lorain city_Lyndhurst city_Macedonia city_Madison city_Maple city_Markham city_Matthews city_Mayfield Heights city_McKees Rocks city_McMurray city_Medina city_Mentor city_Mesa city_Middleburg Heights city_Middleton city_Mint Hill city_Mississauga city_Monona city_Monroe city_Monroeville city_Montreal city_Montréal city_Moon Township city_Mooresville city_Mount Holly city_Murrysville city_New Kensington city_Newmarket city_North Las Vegas city_North Olmsted city_North Ridgeville city_North Royalton city_North York city_Northfield city_Oakmont city_Oakville city_Olmsted Falls city_Orange city_Orange Village city_Other city_Painesville city_Paradise Valley city_Parma city_Peoria city_Phoenix city_Pickering city_Pineville city_Pittsburgh 
city_Pointe-Claire city_Queen Creek city_Richmond Hill city_Rock Hill city_Rocky River city_Saint-Laurent city_Scarborough city_Scottsdale city_Seven Hills city_Sewickley city_Solon city_South Euclid city_South Las Vegas city_Spring Valley city_Stoughton city_Stow city_Streetsboro city_Strongsville city_Sun City city_Sun Prairie city_Surprise city_Tega Cay city_Tempe city_Thornhill city_Tolleson city_Toronto city_Twinsburg city_Unionville city_University Heights city_Upper Saint Clair city_Urbana city_Valley View city_Vaughan city_Verdun city_Verona city_Warrensville Heights city_Waunakee city_Waxhaw city_West Mifflin city_Westlake city_Westmount city_Wexford city_Whitby city_Willoughby city_Woodbridge city_Woodmere city_York categories_ Acai Bowls categories_ Active Life categories_ Adult Entertainment categories_ Afghan categories_ African categories_ Airports categories_ American (New) categories_ American (Traditional) categories_ Amusement Parks categories_ Animal Shelters categories_ Antiques categories_ Appliances categories_ Arabian categories_ Arcades categories_ Argentine categories_ Armenian categories_ Art Galleries categories_ Arts & Crafts categories_ Arts & Entertainment categories_ Asian Fusion categories_ Automotive categories_ Bagels categories_ Bakeries categories_ Bangladeshi categories_ Barbeque categories_ Barbers categories_ Bars categories_ Basque categories_ Beauty & Spas categories_ Bed & Breakfast categories_ Beer categories_ Belgian categories_ Beverage Store categories_ Bistros categories_ Books categories_ Botanical Gardens categories_ Bowling categories_ Brasseries categories_ Brazilian categories_ Breakfast & Brunch categories_ Breweries categories_ Brewpubs categories_ British categories_ Bubble Tea categories_ Buffets categories_ Burgers categories_ Butcher categories_ Cafes categories_ Cafeteria categories_ Cajun/Creole categories_ Cambodian categories_ Canadian (New) categories_ Candy Stores categories_ Cantonese categories_ Car 
Wash categories_ Caribbean categories_ Casinos categories_ Caterers categories_ Cheese Shops categories_ Cheesesteaks categories_ Chicken Shop categories_ Chicken Wings categories_ Chinese categories_ Chocolatiers & Shops categories_ Cinema categories_ Cocktail Bars categories_ Coffee & Tea categories_ Coffee Roasteries categories_ Colombian categories_ Comfort Food categories_ Community Service/Non-Profit categories_ Convenience Stores categories_ Conveyor Belt Sushi categories_ Cooking Classes categories_ Country Dance Halls categories_ Creperies categories_ Cuban categories_ Cupcakes categories_ Custom Cakes categories_ Dance Clubs categories_ Day Spas categories_ Delicatessen categories_ Delis categories_ Department Stores categories_ Desserts categories_ Dim Sum categories_ Diners categories_ Dinner Theater categories_ Distilleries categories_ Dive Bars categories_ Do-It-Yourself Food categories_ Dominican categories_ Donairs categories_ Donuts categories_ Drugstores categories_ Eatertainment categories_ Education categories_ Egyptian categories_ Empanadas categories_ Employment Agencies categories_ Ethiopian categories_ Ethnic Food categories_ Ethnic Grocery categories_ Event Planning & Services categories_ Eyelash Service categories_ Falafel categories_ Farmers Market categories_ Fashion categories_ Fast Food categories_ Festivals categories_ Filipino categories_ Fish & Chips categories_ Fitness & Instruction categories_ Florists categories_ Flowers & Gifts categories_ Fondue categories_ Food Court categories_ Food Delivery Services categories_ Food Stands categories_ Food Tours categories_ Food Trucks categories_ French categories_ Fruits & Veggies categories_ Furniture Stores categories_ Gas Stations categories_ Gastropubs categories_ Gay Bars categories_ Gelato categories_ German categories_ Gift Shops categories_ Gluten-Free categories_ Golf categories_ Greek categories_ Grocery categories_ Guamanian categories_ Hair Removal categories_ Hair Salons 
[DataFrame preview truncated: the first five rows of the model data set, spanning several hundred columns of review, user and business features followed by one-hot encoded `categories_*` indicator columns (cuisines, venue types, and related Yelp categories).]
In [13]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
In [14]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.69      0.40      0.51     50930
           1       0.75      0.91      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.66      0.67    153993
weighted avg       0.73      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.20402226075211
In [15]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[20454 30476]
 [ 9248 93815]]
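As a sanity check (not part of the original notebook), the class-1 precision and recall in the report above can be re-derived directly from this confusion matrix:

```python
import numpy as np

# Confusion matrix printed above: rows are true labels, columns are predictions
matrix = np.array([[20454, 30476],
                   [ 9248, 93815]])

tn, fp = matrix[0]
fn, tp = matrix[1]

precision_1 = tp / (tp + fp)          # fraction of predicted likes that are real
recall_1 = tp / (tp + fn)             # fraction of real likes that were found
accuracy = (tp + tn) / matrix.sum()

print(round(precision_1, 2), round(recall_1, 2), round(accuracy * 100, 2))
# -> 0.75 0.91 74.2
```

The values agree with the classification report and the printed accuracy of 74.204%.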
In [16]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], binary_prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
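One caveat worth noting: because `binary_prediction` is already thresholded to 0/1, the curve above contains a single operating point. Passing the raw sigmoid outputs to `roc_curve` instead traces the full curve and supports an AUC score. A self-contained sketch on synthetic data (the variables below are illustrative, not the notebook's):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=1000)
# Noisy scores correlated with the label, standing in for sigmoid outputs
scores = np.clip(0.3 * y_true + rng.normal(0.45, 0.2, size=1000), 0.0, 1.0)

fpr_hard, tpr_hard, _ = roc_curve(y_true, (scores > 0.5).astype(int))
fpr_soft, tpr_soft, _ = roc_curve(y_true, scores)

print(len(fpr_hard))                        # 3: (0,0), one threshold, (1,1)
print(len(fpr_soft) > 3)                    # True: one point per kept threshold
print(roc_auc_score(y_true, scores) > 0.5)  # True: better than chance
```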

NN 2

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
In [4]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
102
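The cell above uses a common rule of thumb for sizing hidden layers, N_h = N_s / (alpha * (N_i + N_o)) with alpha typically between 2 and 10, to keep the layer width small relative to the number of training samples. A standalone sketch of the arithmetic (the feature count of 781 is an assumption chosen to reproduce the printed 102; the actual width of the train set may differ):

```python
def hidden_neurons(n_samples: int, n_inputs: int, n_outputs: int = 1, alpha: int = 7) -> int:
    """Rule-of-thumb hidden-layer width: N_s / (alpha * (N_i + N_o))."""
    return round(n_samples / (alpha * (n_inputs + n_outputs)))

# 558386 training rows (as Keras reports later in the notebook);
# 781 input features is an assumed value
print(hidden_neurons(558386, 781))  # -> 102
```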
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
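For reference (not in the original notebook), the trainable parameter count of this stack can be checked by hand: a Dense layer holds (n_in + 1) * n_out weights, the +1 being the bias. Assuming 781 input features (an illustrative value; the actual count comes from the train set) and the 102 hidden neurons computed earlier:

```python
def dense_params(n_in: int, n_out: int) -> int:
    # One fully connected layer: weight matrix plus one bias per output unit
    return (n_in + 1) * n_out

n_features, n_hidden = 781, 102   # n_features is an assumed value

total = (dense_params(n_features, n_hidden)      # first hidden layer
         + 4 * dense_params(n_hidden, n_hidden)  # four more hidden layers
         + dense_params(n_hidden, 1))            # sigmoid output layer

print(total)  # -> 121891
```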
In [6]:
# Fit the model to the training set
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 36s 93us/step - loss: 0.5886 - acc: 0.6996 - val_loss: 0.5660 - val_acc: 0.7186
Epoch 2/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5544 - acc: 0.7243 - val_loss: 0.5464 - val_acc: 0.7330
Epoch 3/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5462 - acc: 0.7305 - val_loss: 0.5435 - val_acc: 0.7344
Epoch 4/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5417 - acc: 0.7337 - val_loss: 0.5363 - val_acc: 0.7391
Epoch 5/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5386 - acc: 0.7359 - val_loss: 0.5388 - val_acc: 0.7387
Epoch 6/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5362 - acc: 0.7374 - val_loss: 0.5397 - val_acc: 0.7345
Epoch 7/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5345 - acc: 0.7390 - val_loss: 0.5341 - val_acc: 0.7415
Epoch 8/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5333 - acc: 0.7401 - val_loss: 0.5340 - val_acc: 0.7381
Epoch 9/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5320 - acc: 0.7408 - val_loss: 0.5331 - val_acc: 0.7414
Epoch 10/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5308 - acc: 0.7418 - val_loss: 0.5435 - val_acc: 0.7391
Epoch 11/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5297 - acc: 0.7423 - val_loss: 0.5311 - val_acc: 0.7448
Epoch 12/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5290 - acc: 0.7427 - val_loss: 0.5284 - val_acc: 0.7451
Epoch 13/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5276 - acc: 0.7437 - val_loss: 0.5314 - val_acc: 0.7428
Epoch 14/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5272 - acc: 0.7441 - val_loss: 0.5279 - val_acc: 0.7445
Epoch 15/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5261 - acc: 0.7453 - val_loss: 0.5284 - val_acc: 0.7460
Epoch 16/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5258 - acc: 0.7452 - val_loss: 0.5294 - val_acc: 0.7438
Epoch 17/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5249 - acc: 0.7458 - val_loss: 0.5251 - val_acc: 0.7474
Epoch 18/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5241 - acc: 0.7461 - val_loss: 0.5286 - val_acc: 0.7455
Epoch 19/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5240 - acc: 0.7461 - val_loss: 0.5365 - val_acc: 0.7435
Epoch 20/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5240 - acc: 0.7461 - val_loss: 0.5237 - val_acc: 0.7488
Epoch 21/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5230 - acc: 0.7472 - val_loss: 0.5251 - val_acc: 0.7476
Epoch 22/100
390870/390870 [==============================] - 30s 77us/step - loss: 0.5225 - acc: 0.7470 - val_loss: 0.5434 - val_acc: 0.7404
Epoch 23/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5215 - acc: 0.7472 - val_loss: 0.5368 - val_acc: 0.7448
Epoch 24/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5214 - acc: 0.7477 - val_loss: 0.5253 - val_acc: 0.7475
Epoch 25/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5204 - acc: 0.7487 - val_loss: 0.5236 - val_acc: 0.7473
Epoch 26/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5209 - acc: 0.7479 - val_loss: 0.5234 - val_acc: 0.7487
Epoch 27/100
390870/390870 [==============================] - 30s 76us/step - loss: 0.5207 - acc: 0.7476 - val_loss: 0.5228 - val_acc: 0.7478
Epoch 28/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5205 - acc: 0.7486 - val_loss: 0.5221 - val_acc: 0.7495
Epoch 29/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5198 - acc: 0.7490 - val_loss: 0.5355 - val_acc: 0.7393
Epoch 30/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5194 - acc: 0.7486 - val_loss: 0.5244 - val_acc: 0.7478
Epoch 31/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5185 - acc: 0.7496 - val_loss: 0.5238 - val_acc: 0.7475
Epoch 32/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5182 - acc: 0.7494 - val_loss: 0.5210 - val_acc: 0.7490
Epoch 33/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5181 - acc: 0.7498 - val_loss: 0.5216 - val_acc: 0.7479
Epoch 34/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5170 - acc: 0.7502 - val_loss: 0.5240 - val_acc: 0.7456
Epoch 35/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5168 - acc: 0.7507 - val_loss: 0.5366 - val_acc: 0.7351
Epoch 36/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5170 - acc: 0.7507 - val_loss: 0.5230 - val_acc: 0.7468
Epoch 37/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5165 - acc: 0.7510 - val_loss: 0.5223 - val_acc: 0.7490
Epoch 38/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5163 - acc: 0.7510 - val_loss: 0.5275 - val_acc: 0.7460
Epoch 39/100
390870/390870 [==============================] - 30s 76us/step - loss: 0.5156 - acc: 0.7512 - val_loss: 0.5281 - val_acc: 0.7455
Epoch 40/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5157 - acc: 0.7510 - val_loss: 0.5248 - val_acc: 0.7486
Epoch 41/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5159 - acc: 0.7511 - val_loss: 0.5222 - val_acc: 0.7483
Epoch 42/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5156 - acc: 0.7510 - val_loss: 0.5300 - val_acc: 0.7438
Epoch 43/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5169 - acc: 0.7499 - val_loss: 0.5327 - val_acc: 0.7407
Epoch 44/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5151 - acc: 0.7513 - val_loss: 0.5237 - val_acc: 0.7489
Epoch 45/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5148 - acc: 0.7512 - val_loss: 0.5250 - val_acc: 0.7479
Epoch 46/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5146 - acc: 0.7514 - val_loss: 0.5200 - val_acc: 0.7504
Epoch 47/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5150 - acc: 0.7520 - val_loss: 0.5191 - val_acc: 0.7482
Epoch 48/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5149 - acc: 0.7515 - val_loss: 0.5229 - val_acc: 0.7511
Epoch 49/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5143 - acc: 0.7518 - val_loss: 0.5303 - val_acc: 0.7425
Epoch 50/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5139 - acc: 0.7525 - val_loss: 0.5248 - val_acc: 0.7477
Epoch 51/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5128 - acc: 0.7530 - val_loss: 0.5198 - val_acc: 0.7497
Epoch 52/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5138 - acc: 0.7517 - val_loss: 0.5212 - val_acc: 0.7498
Epoch 53/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5125 - acc: 0.7527 - val_loss: 0.5240 - val_acc: 0.7475
Epoch 54/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5129 - acc: 0.7526 - val_loss: 0.5239 - val_acc: 0.7477
Epoch 55/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5123 - acc: 0.7532 - val_loss: 0.5188 - val_acc: 0.7519
Epoch 56/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5122 - acc: 0.7531 - val_loss: 0.5258 - val_acc: 0.7484
Epoch 57/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5117 - acc: 0.7531 - val_loss: 0.5245 - val_acc: 0.7498
Epoch 58/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5121 - acc: 0.7530 - val_loss: 0.5228 - val_acc: 0.7505
Epoch 59/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5122 - acc: 0.7530 - val_loss: 0.5191 - val_acc: 0.7513
Epoch 60/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5113 - acc: 0.7537 - val_loss: 0.5171 - val_acc: 0.7529
Epoch 61/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5114 - acc: 0.7529 - val_loss: 0.5181 - val_acc: 0.7525
Epoch 62/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5112 - acc: 0.7539 - val_loss: 0.5203 - val_acc: 0.7495
Epoch 63/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5108 - acc: 0.7538 - val_loss: 0.5390 - val_acc: 0.7358
Epoch 64/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5104 - acc: 0.7538 - val_loss: 0.5204 - val_acc: 0.7518
Epoch 65/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5107 - acc: 0.7542 - val_loss: 0.5220 - val_acc: 0.7520
Epoch 66/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5102 - acc: 0.7547 - val_loss: 0.5327 - val_acc: 0.7485
Epoch 67/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5102 - acc: 0.7541 - val_loss: 0.5219 - val_acc: 0.7509
Epoch 68/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5099 - acc: 0.7545 - val_loss: 0.5195 - val_acc: 0.7510
Epoch 69/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5120 - acc: 0.7540 - val_loss: 0.5261 - val_acc: 0.7479
Epoch 70/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5105 - acc: 0.7543 - val_loss: 0.5225 - val_acc: 0.7490
Epoch 71/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5094 - acc: 0.7546 - val_loss: 0.5209 - val_acc: 0.7513
Epoch 72/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5095 - acc: 0.7550 - val_loss: 0.5199 - val_acc: 0.7501
Epoch 73/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5091 - acc: 0.7546 - val_loss: 0.5214 - val_acc: 0.7514
Epoch 74/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5100 - acc: 0.7540 - val_loss: 0.5497 - val_acc: 0.7407
Epoch 75/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5104 - acc: 0.7537 - val_loss: 0.5188 - val_acc: 0.7525
Epoch 76/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5090 - acc: 0.7547 - val_loss: 0.5196 - val_acc: 0.7532
Epoch 77/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5088 - acc: 0.7549 - val_loss: 0.5288 - val_acc: 0.7488
Epoch 78/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5086 - acc: 0.7548 - val_loss: 0.5233 - val_acc: 0.7520
Epoch 79/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5081 - acc: 0.7555 - val_loss: 0.5286 - val_acc: 0.7513
Epoch 80/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5089 - acc: 0.7548 - val_loss: 0.5277 - val_acc: 0.7494
Epoch 81/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5077 - acc: 0.7555 - val_loss: 0.5408 - val_acc: 0.7459
Epoch 82/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5085 - acc: 0.7553 - val_loss: 0.5278 - val_acc: 0.7526
Epoch 83/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5077 - acc: 0.7557 - val_loss: 0.5353 - val_acc: 0.7459
Epoch 84/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5072 - acc: 0.7560 - val_loss: 0.5307 - val_acc: 0.7504
Epoch 85/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5080 - acc: 0.7552 - val_loss: 0.5322 - val_acc: 0.7509
Epoch 86/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5071 - acc: 0.7557 - val_loss: 0.5326 - val_acc: 0.7517
Epoch 87/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5076 - acc: 0.7553 - val_loss: 0.5277 - val_acc: 0.7530
Epoch 88/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5075 - acc: 0.7559 - val_loss: 0.5375 - val_acc: 0.7482
Epoch 89/100
390870/390870 [==============================] - 30s 76us/step - loss: 0.5072 - acc: 0.7559 - val_loss: 0.5296 - val_acc: 0.7523
Epoch 90/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5063 - acc: 0.7566 - val_loss: 0.5287 - val_acc: 0.7526
Epoch 91/100
390870/390870 [==============================] - 30s 76us/step - loss: 0.5075 - acc: 0.7563 - val_loss: 0.5611 - val_acc: 0.7467
Epoch 92/100
390870/390870 [==============================] - 29s 74us/step - loss: 0.5070 - acc: 0.7558 - val_loss: 0.5295 - val_acc: 0.7526
Epoch 93/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5061 - acc: 0.7562 - val_loss: 0.5373 - val_acc: 0.7495
Epoch 94/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5068 - acc: 0.7561 - val_loss: 0.5299 - val_acc: 0.7536
Epoch 95/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5073 - acc: 0.7557 - val_loss: 0.5322 - val_acc: 0.7515
Epoch 96/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5057 - acc: 0.7571 - val_loss: 0.5339 - val_acc: 0.7524
Epoch 97/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5060 - acc: 0.7569 - val_loss: 0.5323 - val_acc: 0.7510
Epoch 98/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5065 - acc: 0.7561 - val_loss: 0.5350 - val_acc: 0.7505
Epoch 99/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5079 - acc: 0.7552 - val_loss: 0.5403 - val_acc: 0.7478
Epoch 100/100
390870/390870 [==============================] - 28s 73us/step - loss: 0.5072 - acc: 0.7562 - val_loss: 0.5282 - val_acc: 0.7527
Out[6]:
<keras.callbacks.History at 0x20c82b579e8>
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_2.h5")
In [8]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 19s 35us/step
[0.5097503033514922, 0.7573470681567259]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
In [11]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.69      0.43      0.53     50930
           1       0.76      0.90      0.83    103063

    accuracy                           0.75    153993
   macro avg       0.72      0.67      0.68    153993
weighted avg       0.74      0.75      0.73    153993

Accuracy for Deep Learning approach: 74.67871916255933
In [12]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[22043 28887]
 [10106 92957]]
In [13]:
# draw ROC curve using the continuous scores (thresholded labels yield only a degenerate three-point curve)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
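As a complement to the ROC plot, the area under the curve summarizes ranking quality in a single number. A minimal pure-Python sketch of the Mann-Whitney formulation (equivalent to scikit-learn's `roc_auc_score`; the function name `auc_score` is ours, not from the notebook):

```python
def auc_score(labels, scores):
    """AUC as the probability that a randomly chosen positive sample is
    scored higher than a randomly chosen negative one (ties count half),
    computed by exhaustive pairwise comparison."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc_score([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # -> 0.75
```

On datasets of this notebook's size, `sklearn.metrics.roc_auc_score` should be preferred over this quadratic pairwise loop.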

NN 3

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
In [4]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 8

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
89
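The cell above applies a common rule of thumb for hidden-layer sizing, N_h = N_s / (α · (N_i + N_o)). A minimal standalone sketch (the function name `hidden_neurons` and the sample numbers are ours, chosen for illustration):

```python
def hidden_neurons(n_samples, n_features, n_outputs=1, alpha=8):
    """Rule-of-thumb layer width: divide the number of training samples by
    alpha times the total count of input and output neurons."""
    return round(n_samples / (alpha * (n_features + n_outputs)))

# e.g. 100,000 training samples, 24 input features, 1 output neuron
print(hidden_neurons(100_000, 24))  # -> 500
```

Larger `alpha` values give narrower layers and act as a crude guard against overfitting.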
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [6]:
# Fit the model to the training data
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 86s 220us/step - loss: 0.5889 - acc: 0.6991 - val_loss: 0.5599 - val_acc: 0.7175
[... epochs 2-99 omitted: train loss fell steadily from 0.556 to ~0.506 while val_loss oscillated between ~0.52 and ~0.61 ...]
Epoch 100/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5062 - acc: 0.7571 - val_loss: 0.5933 - val_acc: 0.7459
Out[6]:
<keras.callbacks.History at 0x209bf585a20>
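In the log above the validation loss drifts upward in later epochs while the training loss keeps falling, a typical overfitting pattern. Keras provides an `EarlyStopping` callback for exactly this; its patience logic can be sketched in plain Python (an assumed illustration, not the notebook's code, with `best_epoch` as a hypothetical name):

```python
def best_epoch(val_losses, patience=5):
    """Return the 1-based epoch at which patience-based early stopping
    would halt training: stop once the validation loss has not improved
    for `patience` consecutive epochs."""
    best, best_i = float('inf'), 0
    for i, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_i = loss, i
        elif i - best_i >= patience:
            return i
    return len(val_losses)

# improvement stalls after the 3rd epoch, so training halts at epoch 3 + patience
print(best_epoch([0.56, 0.54, 0.52, 0.53, 0.55, 0.53, 0.54, 0.56, 0.60], patience=3))  # -> 6
```

In the notebook itself the same effect could likely be obtained by passing `callbacks=[EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)]` to `classifier.fit`.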
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_3.h5")
In [8]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 18s 31us/step
[0.5327735302538145, 0.7514568775011509]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
In [11]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.75      0.31      0.44     50930
           1       0.74      0.95      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.75      0.63      0.63    153993
weighted avg       0.74      0.74      0.70    153993

Accuracy for Deep Learning approach: 73.78582143344178
In [12]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[15639 35291]
 [ 5077 97986]]
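The per-class figures in the classification report above follow directly from this confusion matrix; a small sketch using the printed counts (rows are true classes, columns predicted):

```python
# Counts from the confusion matrix printed above
tn, fp, fn, tp = 15639, 35291, 5077, 97986

precision = tp / (tp + fp)                  # positive-class precision
recall = tp / (tp + fn)                     # positive-class recall
accuracy = (tn + tp) / (tn + fp + fn + tp)  # overall accuracy

print(round(precision, 2), round(recall, 2), round(accuracy, 4))  # -> 0.74 0.95 0.7379
```

These match the report's class-1 row (0.74 precision, 0.95 recall) and the 73.79% accuracy printed earlier.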
In [13]:
# draw ROC curve using the continuous scores (thresholded labels yield only a degenerate three-point curve)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [14]:
_del_all()

NN 4

In [15]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
In [16]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 8

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
89
In [17]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Sixth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [18]:
# Fit the model to the training data
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5898 - acc: 0.6983 - val_loss: 0.5598 - val_acc: 0.7252
[... epochs 2-88 omitted: train loss decreased from 0.555 to ~0.506, val_loss mostly between ~0.52 and ~0.55 ...]
Epoch 89/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5057 - acc: 0.7574 - val_loss: 0.5288 - val_acc: 0.7514
Epoch 90/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5058 - acc: 0.7570 - val_loss: 0.5240 - val_acc: 0.7525
Epoch 91/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5058 - acc: 0.7568 - val_loss: 0.5285 - val_acc: 0.7534
Epoch 92/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5055 - acc: 0.7573 - val_loss: 0.5356 - val_acc: 0.7483
Epoch 93/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5050 - acc: 0.7571 - val_loss: 0.5573 - val_acc: 0.7509
Epoch 94/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5052 - acc: 0.7575 - val_loss: 0.5331 - val_acc: 0.7510
Epoch 95/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5054 - acc: 0.7574 - val_loss: 0.5303 - val_acc: 0.7522
Epoch 96/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5050 - acc: 0.7578 - val_loss: 0.5299 - val_acc: 0.7520
Epoch 97/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5047 - acc: 0.7578 - val_loss: 0.5341 - val_acc: 0.7511
Epoch 98/100
390870/390870 [==============================] - 29s 73us/step - loss: 0.5051 - acc: 0.7577 - val_loss: 0.5604 - val_acc: 0.7486
Epoch 99/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5057 - acc: 0.7576 - val_loss: 0.5413 - val_acc: 0.7446
Epoch 100/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5045 - acc: 0.7576 - val_loss: 0.5278 - val_acc: 0.7536
Out[18]:
<keras.callbacks.History at 0x209e032a048>
In [19]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_4.h5")
In [20]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 20s 35us/step
[0.5089105000514885, 0.758450247677916]
In [21]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
In [22]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
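The thresholding step above is simple enough to spell out. The following is a minimal plain-Python sketch of what `_binarize(prediction, threshold=0.5)` does (sklearn maps values strictly greater than the threshold to 1); the probability values are made up for illustration, not real model outputs.

```python
# Sketch of binarizing continuous scores into 0/1 labels, mirroring
# _binarize(prediction, threshold=0.5) above. Probabilities are invented.
def to_binary(scores, threshold=0.5):
    # values strictly above the threshold map to 1, the rest to 0
    return [1 if s > threshold else 0 for s in scores]

probs = [0.91, 0.48, 0.50, 0.73]
print(to_binary(probs))  # -> [1, 0, 0, 1]
```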
In [23]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.70      0.41      0.52     50930
           1       0.76      0.91      0.83    103063

    accuracy                           0.75    153993
   macro avg       0.73      0.66      0.67    153993
weighted avg       0.74      0.75      0.72    153993

Accuracy for Deep Learning approach: 74.58455903839784
In [24]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[20848 30082]
 [ 9056 94007]]
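As a sanity check, the headline numbers in the classification report can be recovered directly from the confusion matrix printed above (`[[TN, FP], [FN, TP]]`). This recomputation is illustrative only, not part of the notebook's pipeline.

```python
# Recompute accuracy and class-1 precision/recall from the neural
# network's confusion matrix printed above.
tn, fp, fn, tp = 20848, 30082, 9056, 94007

accuracy = (tn + tp) / (tn + fp + fn + tp)
precision_1 = tp / (tp + fp)   # of predicted likes, how many were real
recall_1 = tp / (tp + fn)      # of real likes, how many were caught

print(round(accuracy * 100, 2))                  # -> 74.58, matching the report
print(round(precision_1, 2), round(recall_1, 2)) # -> 0.76 0.91
```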
In [25]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], binary_prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
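Note that because `_roc_curve` is fed the hard 0/1 predictions rather than the sigmoid scores, the "curve" reduces to a single operating point between (0,0) and (1,1). That point can be recovered by hand from the confusion matrix above; the sketch below is a plain-Python check, not part of the pipeline.

```python
# With hard 0/1 predictions, the ROC curve degenerates to one operating
# point. Recover its coordinates from the confusion matrix above.
tn, fp, fn, tp = 20848, 30082, 9056, 94007

fpr = fp / (fp + tn)  # false positive rate (x axis)
tpr = tp / (tp + fn)  # true positive rate (y axis)

print(round(fpr, 3), round(tpr, 3))  # -> 0.591 0.912
```

Passing the raw `prediction` scores instead would trace the full curve.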

7. Other experiments

7.1 SVM without dimensionality reduction

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
best_model = _jl.load("../models/best_SVM.joblib")
best_model.set_params(verbose=10)
best_model.get_params()
Out[4]:
{'C': 0.001,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 10}
In [5]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[LibLinear]
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Out[5]:
LinearSVC(C=0.001, class_weight=None, dual=True, fit_intercept=True,
          intercept_scaling=1, loss='squared_hinge', max_iter=50000,
          multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
          verbose=10)
In [6]:
_jl.dump(best_model, "../models/best_SVM_all.joblib")
Out[6]:
['../models/best_SVM_all.joblib']
In [7]:
print("coef:", best_model.coef_)
print("intercept:", best_model.intercept_)
coef: [[-0.14394614 -0.20443318  0.34380749 ...  0.00380986  0.00857759
   0.        ]]
intercept: [-0.31154943]
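The coefficients and intercept printed above fully determine the classifier: a review is predicted as a like whenever w·x + b > 0. A minimal sketch of that decision rule, using the first three fitted weights (truncated) and an invented feature vector:

```python
# Sketch of LinearSVC's decision rule: predict 1 iff w . x + b > 0.
# Weights are truncated from the output above; the feature vector is invented.
def linear_svc_predict(w, b, x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else 0

w = [-0.144, -0.204, 0.344]
b = -0.312
print(linear_svc_predict(w, b, [0.0, 0.0, 2.0]))  # 0.344*2 - 0.312 = 0.376 > 0 -> 1
```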
In [8]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[8]:
(153993, 2774)
In [9]:
# Run the classifier on the test set
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
predictions:
 [1 0 1 ... 0 0 0]
In [10]:
set(predic)
Out[10]:
{0, 1}
In [11]:
# evaluate classifier

print("Report for Support Vector Machine:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Support Vector Machine:", _accuracy_score(test_set['likes'], predic)*100)
Report for Support Vector Machine:
              precision    recall  f1-score   support

           0       0.72      0.34      0.46     50930
           1       0.74      0.93      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.64      0.64    153993
weighted avg       0.73      0.74      0.71    153993

Accuracy for Support Vector Machine: 73.73971544161098
In [12]:
# Confusion matrix for SVC

print("Confusion Matrix for SVC: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for SVC: 
Out[12]:
array([[17293, 33637],
       [ 6802, 96261]], dtype=int64)
In [13]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("SVM ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()

7.2 Random forest without dimensionality reduction

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
params = _jl.load("../models/best_Random_Forest_2.joblib").get_params()
params['n_jobs'] = -1
params['verbose'] = 10
best_model = _RandomForestClassifier(**params)
best_model.get_params()
Out[4]:
{'bootstrap': False,
 'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 50,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 2,
 'min_samples_split': 10,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 1000,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': None,
 'verbose': 10,
 'warm_start': False}
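With `n_estimators: 1000`, the fitted forest will classify each review by majority vote over its trees. A toy sketch of that aggregation step (the per-tree votes are invented, not from the fitted model):

```python
# Sketch: a random forest classifier aggregates its trees by majority vote.
# The 1000 per-tree votes below are invented for illustration.
from collections import Counter

def forest_predict(tree_votes):
    # return the most common label among the trees' votes
    return Counter(tree_votes).most_common(1)[0][0]

votes = [1] * 620 + [0] * 380   # 1000 trees, 62% vote "like"
print(forest_predict(votes))    # -> 1
```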
In [5]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 12 concurrent workers.
building tree 1 of 1000
building tree 2 of 1000
building tree 3 of 1000
[... trees 4-870 and the intermediate [Parallel] progress checkpoints omitted; elapsed time grows from 44s to about 62min ...]
[Parallel(n_jobs=-1)]: Done 858 tasks      | elapsed: 62.3min
building tree 871 of 1000
building tree 872 of 1000
building tree 873 of 1000
building tree 874 of 1000
building tree 875 of 1000
building tree 876 of 1000
building tree 877 of 1000
building tree 878 of 1000
building tree 879 of 1000
building tree 880 of 1000
building tree 881 of 1000
building tree 882 of 1000
building tree 883 of 1000
building tree 884 of 1000
building tree 885 of 1000
building tree 886 of 1000
building tree 887 of 1000
building tree 888 of 1000
building tree 889 of 1000
building tree 890 of 1000
building tree 891 of 1000
building tree 892 of 1000
building tree 893 of 1000
building tree 894 of 1000
building tree 895 of 1000
building tree 896 of 1000
building tree 897 of 1000
building tree 898 of 1000
building tree 899 of 1000
building tree 900 of 1000
building tree 901 of 1000
building tree 902 of 1000
building tree 903 of 1000
building tree 904 of 1000
building tree 905 of 1000
building tree 906 of 1000
building tree 907 of 1000
building tree 908 of 1000
building tree 909 of 1000
building tree 910 of 1000
building tree 911 of 1000
building tree 912 of 1000
[Parallel(n_jobs=-1)]: Done 901 tasks      | elapsed: 65.4min
building tree 913 of 1000
building tree 914 of 1000
building tree 915 of 1000
building tree 916 of 1000
building tree 917 of 1000
building tree 918 of 1000
building tree 919 of 1000
building tree 920 of 1000
building tree 921 of 1000
building tree 922 of 1000
building tree 923 of 1000
building tree 924 of 1000
building tree 925 of 1000
building tree 926 of 1000
building tree 927 of 1000
building tree 928 of 1000
building tree 929 of 1000
building tree 930 of 1000
building tree 931 of 1000
building tree 932 of 1000
building tree 933 of 1000
building tree 934 of 1000
building tree 935 of 1000
building tree 936 of 1000
building tree 937 of 1000
building tree 938 of 1000
building tree 939 of 1000
building tree 940 of 1000
building tree 941 of 1000
building tree 942 of 1000
building tree 943 of 1000
building tree 944 of 1000
building tree 945 of 1000
building tree 946 of 1000
building tree 947 of 1000
building tree 948 of 1000
building tree 949 of 1000
building tree 950 of 1000
building tree 951 of 1000
building tree 952 of 1000
building tree 953 of 1000
building tree 954 of 1000
building tree 955 of 1000
[Parallel(n_jobs=-1)]: Done 944 tasks      | elapsed: 68.6min
building tree 956 of 1000
building tree 957 of 1000
building tree 958 of 1000
building tree 959 of 1000
building tree 960 of 1000
building tree 961 of 1000
building tree 962 of 1000
building tree 963 of 1000
building tree 964 of 1000
building tree 965 of 1000
building tree 966 of 1000
building tree 967 of 1000
building tree 968 of 1000
building tree 969 of 1000
building tree 970 of 1000
building tree 971 of 1000
building tree 972 of 1000
building tree 973 of 1000
building tree 974 of 1000
building tree 975 of 1000
building tree 976 of 1000
building tree 977 of 1000
building tree 978 of 1000
building tree 979 of 1000
building tree 980 of 1000
building tree 981 of 1000
building tree 982 of 1000
building tree 983 of 1000
building tree 984 of 1000
building tree 985 of 1000
building tree 986 of 1000
building tree 987 of 1000
building tree 988 of 1000
building tree 989 of 1000
building tree 990 of 1000
building tree 991 of 1000
building tree 992 of 1000
building tree 993 of 1000
building tree 994 of 1000
building tree 995 of 1000
building tree 996 of 1000
building tree 997 of 1000
building tree 998 of 1000
building tree 999 of 1000
building tree 1000 of 1000
[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed: 72.3min finished
Out[5]:
RandomForestClassifier(bootstrap=False, class_weight=None, criterion='entropy',
                       max_depth=50, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=2, min_samples_split=10,
                       min_weight_fraction_leaf=0.0, n_estimators=1000,
                       n_jobs=-1, oob_score=False, random_state=None,
                       verbose=10, warm_start=False)
In [6]:
_jl.dump(best_model, "../models/best_Random_Forest_all.joblib")
Out[6]:
['../models/best_Random_Forest_all.joblib']
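The persisted model can later be restored with `joblib.load`. A minimal, self-contained round-trip sketch (a throwaway model and a temp-file path stand in for the real `best_Random_Forest_all.joblib` above):

```python
import os
import tempfile

import joblib
from sklearn.ensemble import RandomForestClassifier

# A small stand-in model; in the notebook this would be best_model
clf = RandomForestClassifier(n_estimators=5, random_state=0)

# Persist and restore via joblib, as done above with the real path
path = os.path.join(tempfile.gettempdir(), "demo_model.joblib")
joblib.dump(clf, path)
restored = joblib.load(path)

print(restored.n_estimators)  # → 5
```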
In [7]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[7]:
(153993, 2774)
In [8]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
[Parallel(n_jobs=12)]: Using backend ThreadingBackend with 12 concurrent workers.
[... progress output for tasks 1-944 elided (elapsed 0.4-36.1 s) ...]
[Parallel(n_jobs=12)]: Done 1000 out of 1000 | elapsed:   37.9s finished
predictions:
 [0 0 1 ... 0 1 0]
In [9]:
set(predic)
Out[9]:
{0, 1}
In [10]:
# evaluate classifier

print("Report for Random Forest classifier:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Random Forest Classifier:", _accuracy_score(test_set['likes'], predic)*100)
Report for Random Forest classifier:
              precision    recall  f1-score   support

           0       0.69      0.38      0.49     50930
           1       0.75      0.92      0.82    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.65      0.66    153993
weighted avg       0.73      0.74      0.71    153993

Accuracy for Random Forest Classifier: 73.88907287993611
In [11]:
# Confusion matrix for Random Forest

print("Confusion Matrix for Random Forest: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for Random Forest: 
Out[11]:
array([[19194, 31736],
       [ 8473, 94590]], dtype=int64)
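As a sanity check, the headline metrics from the classification report can be recomputed directly from this confusion matrix (values copied from the output above; rows are true classes, columns are predicted classes):

```python
import numpy as np

# Confusion matrix from the cell above
cm = np.array([[19194, 31736],
               [ 8473, 94590]])

tn, fp, fn, tp = cm.ravel()

accuracy = (tp + tn) / cm.sum()
precision_1 = tp / (tp + fp)  # precision for class 1
recall_1 = tp / (tp + fn)     # recall for class 1

print(round(accuracy, 4), round(precision_1, 2), round(recall_1, 2))  # → 0.7389 0.75 0.92
```

These match the 0.75 precision and 0.92 recall for class 1 in the report above.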
In [12]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("Random Forest ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
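Note that feeding hard 0/1 predictions to `_roc_curve` yields only a single interior operating point, so the curve above is just two line segments. A full curve and an AUC score come from the positive-class probabilities instead (`RandomForestClassifier` exposes `predict_proba`); a self-contained sketch with synthetic data standing in for the Yelp features:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve, auc

# Synthetic stand-in for the real train/test features
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Probability of the positive class, not the hard 0/1 label
scores = model.predict_proba(X)[:, 1]

fpr, tpr, thresholds = roc_curve(y, scores)
print("AUC:", round(auc(fpr, tpr), 3))
```

In the notebook, replacing `predic` with `best_model.predict_proba(...)[:, 1]` in the ROC cell would give the same effect on the real test set.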

7.3 Deep Learning without dimensionality reduction

NN1

alpha = 6

number of hidden layers = 3

batch size = 100

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
# Number of neurons for the hidden layers
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 6

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
34
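The rule of thumb used here sizes each hidden layer as N_samples / (alpha * (N_inputs + N_outputs)); it is a heuristic starting point, not a law. As a reusable helper (values below copied from this notebook: 558386 samples, 2769 features after dropping the 5 label/ID columns, 1 output, alpha = 6):

```python
def hidden_neurons(n_samples, n_inputs, n_outputs, alpha):
    """Rule-of-thumb hidden layer width: samples / (alpha * (inputs + outputs))."""
    return round(n_samples / (alpha * (n_inputs + n_outputs)))

print(hidden_neurons(558386, 2769, 1, 6))  # → 34
```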
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [6]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 180s 461us/step - loss: 0.5932 - acc: 0.6968 - val_loss: 0.5693 - val_acc: 0.7170
Epoch 2/100
390870/390870 [==============================] - 143s 366us/step - loss: 0.5560 - acc: 0.7226 - val_loss: 0.5513 - val_acc: 0.7321
[... epochs 3-99 elided: training loss decreases from 0.5464 to 0.5042, val_acc settles around 0.75 ...]
Epoch 100/100
390870/390870 [==============================] - 63s 161us/step - loss: 0.5036 - acc: 0.7579 - val_loss: 0.5342 - val_acc: 0.7499
Out[6]:
<keras.callbacks.History at 0x1d81f160d30>
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all.h5")
In [8]:
# evaluate the model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 57s 101us/step
[0.512360177343886, 0.7563101510431093]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[9]:
(153993, 2774)
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[10]:
array([[1.],
       [0.],
       [1.],
       ...,
       [0.],
       [1.],
       [0.]], dtype=float32)
In [16]:
set(binary_prediction[:,0])
Out[16]:
{0.0, 1.0}
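`sklearn.preprocessing.binarize` simply thresholds the array (values strictly greater than the threshold map to 1). The same result can be obtained with plain NumPy, shown here on example sigmoid outputs in place of the real `prediction` array:

```python
import numpy as np

# Example sigmoid outputs from the network (stand-ins for `prediction`)
prediction = np.array([[0.81], [0.12], [0.67], [0.40]], dtype=np.float32)

# Equivalent to _binarize(prediction, threshold=0.5)
binary_prediction = (prediction > 0.5).astype(np.float32)

print(binary_prediction.ravel())  # → [1. 0. 1. 0.]
```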
In [12]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.71      0.38      0.49     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.65      0.66    153993
weighted avg       0.74      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.33649581474484
In [13]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[19270 31660]
 [ 7860 95203]]
In [14]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], binary_prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()

NN 2

alpha = 2

number of hidden layers = 3

batch size = 100

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
# Number of neurons for the hidden layers
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 2

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
67
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [6]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 76s 194us/step - loss: 0.5945 - acc: 0.6981 - val_loss: 0.5504 - val_acc: 0.7257
Epoch 2/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5560 - acc: 0.7232 - val_loss: 0.5542 - val_acc: 0.7265
[... epochs 3-29 elided: training loss decreases from 0.5459 to 0.5156 ...]
Epoch 30/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5145 - acc: 0.7519 - val_loss: 0.5264 - val_acc: 0.7464
Epoch 31/100
390870/390870 [==============================] - 72s 183us/step - loss: 0.5155 - acc: 0.7515 - val_loss: 0.5299 - val_acc: 0.7429
Epoch 32/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5154 - acc: 0.7516 - val_loss: 0.5260 - val_acc: 0.7458
Epoch 33/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5160 - acc: 0.7506 - val_loss: 0.5369 - val_acc: 0.7389
Epoch 34/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5151 - acc: 0.7512 - val_loss: 0.5578 - val_acc: 0.7236
Epoch 35/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5131 - acc: 0.7525 - val_loss: 0.5299 - val_acc: 0.7433
Epoch 36/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5125 - acc: 0.7526 - val_loss: 0.5241 - val_acc: 0.7469
Epoch 37/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5139 - acc: 0.7525 - val_loss: 0.5235 - val_acc: 0.7478
Epoch 38/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5135 - acc: 0.7529 - val_loss: 0.5226 - val_acc: 0.7495
Epoch 39/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5153 - acc: 0.7518 - val_loss: 0.5248 - val_acc: 0.7476
Epoch 40/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5110 - acc: 0.7542 - val_loss: 0.5360 - val_acc: 0.7444
Epoch 41/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5110 - acc: 0.7541 - val_loss: 0.5234 - val_acc: 0.7478
Epoch 42/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5109 - acc: 0.7545 - val_loss: 0.5260 - val_acc: 0.7481
Epoch 43/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5105 - acc: 0.7537 - val_loss: 0.5239 - val_acc: 0.7488
Epoch 44/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5109 - acc: 0.7540 - val_loss: 0.5303 - val_acc: 0.7405
Epoch 45/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5103 - acc: 0.7549 - val_loss: 0.5251 - val_acc: 0.7485
Epoch 46/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5092 - acc: 0.7550 - val_loss: 0.5380 - val_acc: 0.7445
Epoch 47/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5096 - acc: 0.7551 - val_loss: 0.5319 - val_acc: 0.7433
Epoch 48/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5082 - acc: 0.7555 - val_loss: 0.5193 - val_acc: 0.7508
Epoch 49/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5084 - acc: 0.7553 - val_loss: 0.5240 - val_acc: 0.7484
Epoch 50/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5092 - acc: 0.7554 - val_loss: 0.5313 - val_acc: 0.7443
Epoch 51/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5087 - acc: 0.7554 - val_loss: 0.5237 - val_acc: 0.7494
Epoch 52/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5074 - acc: 0.7562 - val_loss: 0.5261 - val_acc: 0.7464
Epoch 53/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5076 - acc: 0.7557 - val_loss: 0.5212 - val_acc: 0.7500
Epoch 54/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5068 - acc: 0.7558 - val_loss: 0.5263 - val_acc: 0.7494
Epoch 55/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5080 - acc: 0.7559 - val_loss: 0.5306 - val_acc: 0.7435
Epoch 56/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5069 - acc: 0.7564 - val_loss: 0.5259 - val_acc: 0.7492
Epoch 57/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5067 - acc: 0.7562 - val_loss: 0.5279 - val_acc: 0.7469
Epoch 58/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5069 - acc: 0.7559 - val_loss: 0.5317 - val_acc: 0.7491
Epoch 59/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5068 - acc: 0.7570 - val_loss: 0.5292 - val_acc: 0.7458
Epoch 60/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5057 - acc: 0.7573 - val_loss: 0.5221 - val_acc: 0.7499
Epoch 61/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5063 - acc: 0.7571 - val_loss: 0.5270 - val_acc: 0.7494
Epoch 62/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5055 - acc: 0.7571 - val_loss: 0.5219 - val_acc: 0.7496
Epoch 63/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5046 - acc: 0.7573 - val_loss: 0.5254 - val_acc: 0.7471
Epoch 64/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5055 - acc: 0.7571 - val_loss: 0.5264 - val_acc: 0.7494
Epoch 65/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5039 - acc: 0.7576 - val_loss: 0.5297 - val_acc: 0.7470
Epoch 66/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5045 - acc: 0.7574 - val_loss: 0.5283 - val_acc: 0.7481
Epoch 67/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5056 - acc: 0.7572 - val_loss: 0.5276 - val_acc: 0.7473
Epoch 68/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5059 - acc: 0.7566 - val_loss: 0.5221 - val_acc: 0.7494
Epoch 69/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5039 - acc: 0.7581 - val_loss: 0.5297 - val_acc: 0.7480
Epoch 70/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5033 - acc: 0.7584 - val_loss: 0.5300 - val_acc: 0.7451
Epoch 71/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5034 - acc: 0.7576 - val_loss: 0.5224 - val_acc: 0.7500
Epoch 72/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5023 - acc: 0.7587 - val_loss: 0.5275 - val_acc: 0.7484
Epoch 73/100
390870/390870 [==============================] - 71s 182us/step - loss: 0.5031 - acc: 0.7584 - val_loss: 0.5269 - val_acc: 0.7485
Epoch 74/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5027 - acc: 0.7584 - val_loss: 0.5260 - val_acc: 0.7488
Epoch 75/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5018 - acc: 0.7597 - val_loss: 0.5272 - val_acc: 0.7478
Epoch 76/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5026 - acc: 0.7590 - val_loss: 0.5292 - val_acc: 0.7493
Epoch 77/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5021 - acc: 0.7591 - val_loss: 0.5394 - val_acc: 0.7473
Epoch 78/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5029 - acc: 0.7586 - val_loss: 0.5319 - val_acc: 0.7476
Epoch 79/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5027 - acc: 0.7591 - val_loss: 0.5461 - val_acc: 0.7434
Epoch 80/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5021 - acc: 0.7592 - val_loss: 0.5273 - val_acc: 0.7498
Epoch 81/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5015 - acc: 0.7594 - val_loss: 0.5773 - val_acc: 0.7094
Epoch 82/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5009 - acc: 0.7600 - val_loss: 0.5287 - val_acc: 0.7478
Epoch 83/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5015 - acc: 0.7593 - val_loss: 0.5263 - val_acc: 0.7478
Epoch 84/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5016 - acc: 0.7588 - val_loss: 0.5305 - val_acc: 0.7472
Epoch 85/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5015 - acc: 0.7598 - val_loss: 0.5262 - val_acc: 0.7498
Epoch 86/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5003 - acc: 0.7598 - val_loss: 0.5268 - val_acc: 0.7487
Epoch 87/100
390870/390870 [==============================] - 71s 181us/step - loss: 0.5003 - acc: 0.7599 - val_loss: 0.5320 - val_acc: 0.7493
Epoch 88/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5043 - acc: 0.7576 - val_loss: 0.5263 - val_acc: 0.7477
Epoch 89/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5013 - acc: 0.7593 - val_loss: 0.5292 - val_acc: 0.7459
Epoch 90/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5003 - acc: 0.7599 - val_loss: 0.5309 - val_acc: 0.7497
Epoch 91/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5030 - acc: 0.7584 - val_loss: 0.5248 - val_acc: 0.7496
Epoch 92/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5012 - acc: 0.7596 - val_loss: 0.5330 - val_acc: 0.7478
Epoch 93/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5009 - acc: 0.7593 - val_loss: 0.5318 - val_acc: 0.7475
Epoch 94/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.4989 - acc: 0.7607 - val_loss: 0.5308 - val_acc: 0.7487
Epoch 95/100
390870/390870 [==============================] - 65s 168us/step - loss: 0.4990 - acc: 0.7607 - val_loss: 0.5308 - val_acc: 0.7497
Epoch 96/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.4990 - acc: 0.7613 - val_loss: 0.5352 - val_acc: 0.7477
Epoch 97/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.4985 - acc: 0.7607 - val_loss: 0.5291 - val_acc: 0.7488
Epoch 98/100
390870/390870 [==============================] - 65s 168us/step - loss: 0.4983 - acc: 0.7615 - val_loss: 0.5259 - val_acc: 0.7461
Epoch 99/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.4999 - acc: 0.7602 - val_loss: 0.5438 - val_acc: 0.7421
Epoch 100/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.4978 - acc: 0.7614 - val_loss: 0.5337 - val_acc: 0.7468
Out[6]:
<keras.callbacks.History at 0x1dcde2b5d68>
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all2.h5")
In [8]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 51s 90us/step
[0.5102951304472344, 0.7552123441486025]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[9]:
(153993, 2774)
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[10]:
array([[1.],
       [0.],
       [1.],
       ...,
       [1.],
       [1.],
       [1.]], dtype=float32)
In [11]:
set(binary_prediction[:,0])
Out[11]:
{0.0, 1.0}
In [12]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.73      0.33      0.46     50930
           1       0.74      0.94      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.64      0.64    153993
weighted avg       0.74      0.74      0.70    153993

Accuracy for Deep Learning approach: 73.82478424343964
In [13]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[16881 34049]
 [ 6259 96804]]
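As a sanity check, the accuracy reported above can be recomputed directly from the confusion matrix as (TN + TP) / total, using the four counts printed in the previous cell:

```python
# Recompute accuracy from the confusion matrix printed above.
# Rows are true classes (0, 1), columns are predicted classes.
tn, fp = 16881, 34049   # true class 0
fn, tp = 6259, 96804    # true class 1

total = tn + fp + fn + tp            # 153993 test samples
accuracy = (tn + tp) / total

print(round(accuracy * 100, 4))      # 73.8248, matching the score above
```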
In [14]:
# draw ROC curve (from the raw sigmoid outputs, not the binarized labels,
# so the curve has more than three points)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
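The ROC curve can also be summarized with a single AUC score via scikit-learn's `roc_auc_score`. In the notebook this would be `roc_auc_score(test_set['likes'], prediction)`; since those variables are not reproducible here, the sketch below uses tiny synthetic labels and scores purely to illustrate the call:

```python
from sklearn.metrics import roc_auc_score

# Illustrative stand-ins for test_set['likes'] and the model's sigmoid outputs
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

# AUC = probability that a random positive is scored above a random negative
print(roc_auc_score(y_true, y_score))  # 0.75
```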

NN 3

alpha = 7

number of hidden layers = 3

batch size = 100

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
29
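The layer width printed above follows a common rule of thumb, N_hidden = N_samples / (alpha * (N_inputs + N_outputs)). A minimal sketch with this notebook's numbers (558,386 training rows and 2,769 input features, i.e. the 2,774 columns minus the 5 dropped ones):

```python
# Rule-of-thumb hidden-layer width:
#   N_hidden = N_samples / (alpha * (N_inputs + N_outputs))
def hidden_neurons(n_samples, n_inputs, n_outputs=1, alpha=7):
    return round(n_samples / (alpha * (n_inputs + n_outputs)))

print(hidden_neurons(558386, 2769))  # 29, as in the cell above
```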
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
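With 2,769 inputs and three 29-unit hidden layers, the network is quite small; its trainable-parameter count can be checked by hand (Keras's `classifier.summary()` would report the same per-layer totals):

```python
# Parameter count for the 3-hidden-layer network above
# (a Dense layer has inputs * units weights plus units biases).
n_inputs, n_hidden = 2769, 29

params = n_inputs * n_hidden + n_hidden          # first hidden layer
params += 2 * (n_hidden * n_hidden + n_hidden)   # second and third hidden layers
params += n_hidden * 1 + 1                       # sigmoid output layer

print(params)  # 82100
```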
In [6]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 82s 209us/step - loss: 0.5856 - acc: 0.7017 - val_loss: 0.5694 - val_acc: 0.7125
Epoch 2/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5533 - acc: 0.7251 - val_loss: 0.5492 - val_acc: 0.7293
Epoch 3/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5445 - acc: 0.7317 - val_loss: 0.5462 - val_acc: 0.7312
Epoch 4/100
390870/390870 [==============================] - 73s 187us/step - loss: 0.5410 - acc: 0.7339 - val_loss: 0.5389 - val_acc: 0.7389
Epoch 5/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5381 - acc: 0.7366 - val_loss: 0.5335 - val_acc: 0.7427
Epoch 6/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5354 - acc: 0.7376 - val_loss: 0.5365 - val_acc: 0.7390
Epoch 7/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5334 - acc: 0.7398 - val_loss: 0.5297 - val_acc: 0.7436
Epoch 8/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5324 - acc: 0.7405 - val_loss: 0.5356 - val_acc: 0.7383
Epoch 9/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5301 - acc: 0.7424 - val_loss: 0.5294 - val_acc: 0.7443
Epoch 10/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5293 - acc: 0.7425 - val_loss: 0.5266 - val_acc: 0.7462
Epoch 11/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5279 - acc: 0.7434 - val_loss: 0.5368 - val_acc: 0.7414
Epoch 12/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5273 - acc: 0.7445 - val_loss: 0.5278 - val_acc: 0.7459
Epoch 13/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5260 - acc: 0.7443 - val_loss: 0.5314 - val_acc: 0.7415
Epoch 14/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5261 - acc: 0.7449 - val_loss: 0.5342 - val_acc: 0.7440
Epoch 15/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5254 - acc: 0.7453 - val_loss: 0.5326 - val_acc: 0.7456
Epoch 16/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5241 - acc: 0.7460 - val_loss: 0.5312 - val_acc: 0.7421
Epoch 17/100
390870/390870 [==============================] - 67s 173us/step - loss: 0.5248 - acc: 0.7454 - val_loss: 0.5303 - val_acc: 0.7437
Epoch 18/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5236 - acc: 0.7460 - val_loss: 0.5257 - val_acc: 0.7455
Epoch 19/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5216 - acc: 0.7477 - val_loss: 0.5236 - val_acc: 0.7479
Epoch 20/100
390870/390870 [==============================] - 72s 184us/step - loss: 0.5218 - acc: 0.7474 - val_loss: 0.5258 - val_acc: 0.7471
Epoch 21/100
390870/390870 [==============================] - 73s 188us/step - loss: 0.5221 - acc: 0.7474 - val_loss: 0.5318 - val_acc: 0.7455
Epoch 22/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5209 - acc: 0.7478 - val_loss: 0.5259 - val_acc: 0.7465
Epoch 23/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5220 - acc: 0.7472 - val_loss: 0.5280 - val_acc: 0.7454
Epoch 24/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5215 - acc: 0.7482 - val_loss: 0.5226 - val_acc: 0.7485
Epoch 25/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5219 - acc: 0.7473 - val_loss: 0.5232 - val_acc: 0.7481
Epoch 26/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5193 - acc: 0.7486 - val_loss: 0.5268 - val_acc: 0.7462
Epoch 27/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5184 - acc: 0.7495 - val_loss: 0.5267 - val_acc: 0.7469
Epoch 28/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5189 - acc: 0.7495 - val_loss: 0.5243 - val_acc: 0.7468
Epoch 29/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5178 - acc: 0.7499 - val_loss: 0.5323 - val_acc: 0.7437
Epoch 30/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5177 - acc: 0.7498 - val_loss: 0.5235 - val_acc: 0.7464
Epoch 31/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5170 - acc: 0.7503 - val_loss: 0.5246 - val_acc: 0.7437
Epoch 32/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5164 - acc: 0.7512 - val_loss: 0.5275 - val_acc: 0.7441
Epoch 33/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5160 - acc: 0.7507 - val_loss: 0.5223 - val_acc: 0.7495
Epoch 34/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5156 - acc: 0.7514 - val_loss: 0.5273 - val_acc: 0.7480
Epoch 35/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5156 - acc: 0.7514 - val_loss: 0.5261 - val_acc: 0.7478
Epoch 36/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5147 - acc: 0.7515 - val_loss: 0.5208 - val_acc: 0.7484
Epoch 37/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5148 - acc: 0.7517 - val_loss: 0.5228 - val_acc: 0.7475
Epoch 38/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5145 - acc: 0.7511 - val_loss: 0.5331 - val_acc: 0.7420
Epoch 39/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5140 - acc: 0.7521 - val_loss: 0.5244 - val_acc: 0.7469
Epoch 40/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5134 - acc: 0.7525 - val_loss: 0.5236 - val_acc: 0.7451
Epoch 41/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5137 - acc: 0.7521 - val_loss: 0.5223 - val_acc: 0.7479
Epoch 42/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5142 - acc: 0.7516 - val_loss: 0.5231 - val_acc: 0.7477
Epoch 43/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5129 - acc: 0.7530 - val_loss: 0.5260 - val_acc: 0.7479
Epoch 44/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5140 - acc: 0.7520 - val_loss: 0.5233 - val_acc: 0.7477
Epoch 45/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5131 - acc: 0.7522 - val_loss: 0.5250 - val_acc: 0.7470
Epoch 46/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5142 - acc: 0.7520 - val_loss: 0.5305 - val_acc: 0.7462
Epoch 47/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5134 - acc: 0.7525 - val_loss: 0.5262 - val_acc: 0.7454
Epoch 48/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5118 - acc: 0.7535 - val_loss: 0.5210 - val_acc: 0.7506
Epoch 49/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5119 - acc: 0.7536 - val_loss: 0.5244 - val_acc: 0.7473
Epoch 50/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5121 - acc: 0.7537 - val_loss: 0.5229 - val_acc: 0.7498
Epoch 51/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5118 - acc: 0.7534 - val_loss: 0.5242 - val_acc: 0.7471
Epoch 52/100
390870/390870 [==============================] - 65s 165us/step - loss: 0.5109 - acc: 0.7543 - val_loss: 0.5290 - val_acc: 0.7461
Epoch 53/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5110 - acc: 0.7537 - val_loss: 0.5242 - val_acc: 0.7479
Epoch 54/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5114 - acc: 0.7533 - val_loss: 0.5283 - val_acc: 0.7450
Epoch 55/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5109 - acc: 0.7540 - val_loss: 0.5282 - val_acc: 0.7471
Epoch 56/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5105 - acc: 0.7545 - val_loss: 0.5238 - val_acc: 0.7469
Epoch 57/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5102 - acc: 0.7547 - val_loss: 0.5273 - val_acc: 0.7467
Epoch 58/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5102 - acc: 0.7545 - val_loss: 0.5283 - val_acc: 0.7459
Epoch 59/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5103 - acc: 0.7540 - val_loss: 0.5335 - val_acc: 0.7429
Epoch 60/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5104 - acc: 0.7539 - val_loss: 0.5281 - val_acc: 0.7426
Epoch 61/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5099 - acc: 0.7544 - val_loss: 0.5273 - val_acc: 0.7482
Epoch 62/100
390870/390870 [==============================] - 69s 178us/step - loss: 0.5114 - acc: 0.7540 - val_loss: 0.5351 - val_acc: 0.7420
Epoch 63/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5107 - acc: 0.7540 - val_loss: 0.5387 - val_acc: 0.7308
Epoch 64/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5099 - acc: 0.7547 - val_loss: 0.5230 - val_acc: 0.7477
Epoch 65/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5098 - acc: 0.7548 - val_loss: 0.5240 - val_acc: 0.7484
Epoch 66/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5092 - acc: 0.7552 - val_loss: 0.5368 - val_acc: 0.7424
Epoch 67/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5087 - acc: 0.7551 - val_loss: 0.5328 - val_acc: 0.7424
Epoch 68/100
390870/390870 [==============================] - 63s 162us/step - loss: 0.5097 - acc: 0.7542 - val_loss: 0.5264 - val_acc: 0.7458
Epoch 69/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5085 - acc: 0.7557 - val_loss: 0.5245 - val_acc: 0.7476
Epoch 70/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5086 - acc: 0.7556 - val_loss: 0.5275 - val_acc: 0.7456
Epoch 71/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5086 - acc: 0.7547 - val_loss: 0.5261 - val_acc: 0.7472
Epoch 72/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5085 - acc: 0.7556 - val_loss: 0.5221 - val_acc: 0.7496
Epoch 73/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5084 - acc: 0.7551 - val_loss: 0.5211 - val_acc: 0.7501
Epoch 74/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5088 - acc: 0.7554 - val_loss: 0.5263 - val_acc: 0.7473
Epoch 75/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5091 - acc: 0.7546 - val_loss: 0.5258 - val_acc: 0.7480
Epoch 76/100
390870/390870 [==============================] - 72s 185us/step - loss: 0.5073 - acc: 0.7561 - val_loss: 0.5252 - val_acc: 0.7488
Epoch 77/100
390870/390870 [==============================] - 68s 175us/step - loss: 0.5082 - acc: 0.7550 - val_loss: 0.5250 - val_acc: 0.7473
Epoch 78/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5072 - acc: 0.7559 - val_loss: 0.5376 - val_acc: 0.7452
Epoch 79/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5082 - acc: 0.7559 - val_loss: 0.5283 - val_acc: 0.7422
Epoch 80/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5081 - acc: 0.7551 - val_loss: 0.5314 - val_acc: 0.7409
Epoch 81/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5068 - acc: 0.7567 - val_loss: 0.5225 - val_acc: 0.7490
Epoch 82/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5067 - acc: 0.7562 - val_loss: 0.5258 - val_acc: 0.7491
Epoch 83/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5080 - acc: 0.7559 - val_loss: 0.5203 - val_acc: 0.7497
Epoch 84/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5062 - acc: 0.7567 - val_loss: 0.5245 - val_acc: 0.7490
Epoch 85/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5069 - acc: 0.7559 - val_loss: 0.5207 - val_acc: 0.7505
Epoch 86/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5067 - acc: 0.7566 - val_loss: 0.5253 - val_acc: 0.7466
Epoch 87/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5066 - acc: 0.7565 - val_loss: 0.5364 - val_acc: 0.7421
Epoch 88/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5062 - acc: 0.7562 - val_loss: 0.5209 - val_acc: 0.7493
Epoch 89/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5061 - acc: 0.7562 - val_loss: 0.5227 - val_acc: 0.7481
Epoch 90/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5058 - acc: 0.7566 - val_loss: 0.5244 - val_acc: 0.7472
Epoch 91/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5061 - acc: 0.7562 - val_loss: 0.5256 - val_acc: 0.7465
Epoch 92/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5057 - acc: 0.7563 - val_loss: 0.5331 - val_acc: 0.7471
Epoch 93/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5055 - acc: 0.7568 - val_loss: 0.5243 - val_acc: 0.7490
Epoch 94/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5056 - acc: 0.7565 - val_loss: 0.5261 - val_acc: 0.7478
Epoch 95/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5058 - acc: 0.7570 - val_loss: 0.5244 - val_acc: 0.7467
Epoch 96/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5049 - acc: 0.7573 - val_loss: 0.5215 - val_acc: 0.7513
Epoch 97/100
390870/390870 [==============================] - 64s 164us/step - loss: 0.5062 - acc: 0.7567 - val_loss: 0.5249 - val_acc: 0.7499
Epoch 98/100
390870/390870 [==============================] - 64s 163us/step - loss: 0.5066 - acc: 0.7565 - val_loss: 0.5254 - val_acc: 0.7505
Epoch 99/100
390870/390870 [==============================] - 64s 165us/step - loss: 0.5047 - acc: 0.7572 - val_loss: 0.5241 - val_acc: 0.7479
Epoch 100/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5046 - acc: 0.7572 - val_loss: 0.5229 - val_acc: 0.7483
Out[6]:
<keras.callbacks.History at 0x1e2ac885a58>
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all3.h5")
In [8]:
# Evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 54s 96us/step
[0.5092164321641403, 0.7539193317881909]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[9]:
(153993, 2774)
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[10]:
array([[1.],
       [0.],
       [1.],
       ...,
       [0.],
       [1.],
       [0.]], dtype=float32)
In [11]:
set(binary_prediction[:,0])
Out[11]:
{0.0, 1.0}
In [12]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.70      0.38      0.49     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.65      0.66    153993
weighted avg       0.74      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.23843940958356
In [13]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[19398 31532]
 [ 8139 94924]]
In [14]:
# draw ROC curve (from the raw sigmoid outputs, not the binarized labels,
# so the curve has more than three points)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [29]:
_del_all()

NN 4

alpha = 7

number of hidden layers = 5

batch size = 100

In [5]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[5]:
(558386, 2774)
In [31]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
29
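The cell above applies a common rule of thumb for sizing hidden layers, N_h = N_s / (alpha * (N_i + N_o)). As a quick check of the printed value (the function name here is just for illustration):

```python
# Rule of thumb used above: hidden neurons = samples / (alpha * (inputs + outputs))
def hidden_neurons(n_samples, n_inputs, n_outputs = 1, alpha = 7):
    return round(n_samples / (alpha * (n_inputs + n_outputs)))

# 558386 training rows; 2774 columns minus the 5 dropped ones = 2769 features
print(hidden_neurons(558386, 2769))  # 29, matching the printed value above
```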
In [32]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [33]:
# Fit the model to the training set
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 76s 194us/step - loss: 0.5860 - acc: 0.7003 - val_loss: 0.5558 - val_acc: 0.7246
Epoch 2/100
390870/390870 [==============================] - 68s 175us/step - loss: 0.5507 - acc: 0.7276 - val_loss: 0.5472 - val_acc: 0.7286
Epoch 3/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5437 - acc: 0.7324 - val_loss: 0.5416 - val_acc: 0.7332
Epoch 4/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5389 - acc: 0.7367 - val_loss: 0.5404 - val_acc: 0.7350
Epoch 5/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5355 - acc: 0.7380 - val_loss: 0.5455 - val_acc: 0.7335
Epoch 6/100
390870/390870 [==============================] - 82s 210us/step - loss: 0.5338 - acc: 0.7389 - val_loss: 0.5305 - val_acc: 0.7453
Epoch 7/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5318 - acc: 0.7409 - val_loss: 0.5410 - val_acc: 0.7380
Epoch 8/100
390870/390870 [==============================] - 69s 178us/step - loss: 0.5310 - acc: 0.7413 - val_loss: 0.5284 - val_acc: 0.7453
Epoch 9/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5294 - acc: 0.7424 - val_loss: 0.5343 - val_acc: 0.7369
Epoch 10/100
390870/390870 [==============================] - 69s 178us/step - loss: 0.5278 - acc: 0.7440 - val_loss: 0.5295 - val_acc: 0.7431
Epoch 11/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5272 - acc: 0.7438 - val_loss: 0.5369 - val_acc: 0.7398
Epoch 12/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5273 - acc: 0.7443 - val_loss: 0.5292 - val_acc: 0.7453
Epoch 13/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5255 - acc: 0.7451 - val_loss: 0.5319 - val_acc: 0.7404
Epoch 14/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5257 - acc: 0.7452 - val_loss: 0.5338 - val_acc: 0.7419
Epoch 15/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5247 - acc: 0.7460 - val_loss: 0.5309 - val_acc: 0.7463
Epoch 16/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5230 - acc: 0.7470 - val_loss: 0.5235 - val_acc: 0.7487
Epoch 17/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5225 - acc: 0.7472 - val_loss: 0.5274 - val_acc: 0.7448
Epoch 18/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5218 - acc: 0.7477 - val_loss: 0.5264 - val_acc: 0.7470
Epoch 19/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5210 - acc: 0.7483 - val_loss: 0.5228 - val_acc: 0.7489
Epoch 20/100
390870/390870 [==============================] - 74s 189us/step - loss: 0.5203 - acc: 0.7481 - val_loss: 0.5470 - val_acc: 0.7400
Epoch 21/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5203 - acc: 0.7486 - val_loss: 0.5242 - val_acc: 0.7466
Epoch 22/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5194 - acc: 0.7491 - val_loss: 0.5245 - val_acc: 0.7475
Epoch 23/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5192 - acc: 0.7496 - val_loss: 0.5237 - val_acc: 0.7484
Epoch 24/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5187 - acc: 0.7499 - val_loss: 0.5245 - val_acc: 0.7465
Epoch 25/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5189 - acc: 0.7495 - val_loss: 0.5257 - val_acc: 0.7472
Epoch 26/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5177 - acc: 0.7501 - val_loss: 0.5225 - val_acc: 0.7483
Epoch 27/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5175 - acc: 0.7505 - val_loss: 0.5334 - val_acc: 0.7459
Epoch 28/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5164 - acc: 0.7506 - val_loss: 0.5292 - val_acc: 0.7443
Epoch 29/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5173 - acc: 0.7504 - val_loss: 0.5240 - val_acc: 0.7487
Epoch 30/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5161 - acc: 0.7515 - val_loss: 0.5209 - val_acc: 0.7504
Epoch 31/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5160 - acc: 0.7512 - val_loss: 0.5322 - val_acc: 0.7455
Epoch 32/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5155 - acc: 0.7513 - val_loss: 0.5241 - val_acc: 0.7481
Epoch 33/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5147 - acc: 0.7520 - val_loss: 0.5242 - val_acc: 0.7483
Epoch 34/100
390870/390870 [==============================] - 72s 184us/step - loss: 0.5151 - acc: 0.7522 - val_loss: 0.5218 - val_acc: 0.7489
Epoch 35/100
390870/390870 [==============================] - 68s 175us/step - loss: 0.5145 - acc: 0.7526 - val_loss: 0.5315 - val_acc: 0.7455
Epoch 36/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5145 - acc: 0.7524 - val_loss: 0.5201 - val_acc: 0.7499
Epoch 37/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5143 - acc: 0.7519 - val_loss: 0.5216 - val_acc: 0.7484
Epoch 38/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5135 - acc: 0.7530 - val_loss: 0.5245 - val_acc: 0.7464
Epoch 39/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5133 - acc: 0.7526 - val_loss: 0.5236 - val_acc: 0.7473
Epoch 40/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5127 - acc: 0.7528 - val_loss: 0.5242 - val_acc: 0.7474
Epoch 41/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5129 - acc: 0.7524 - val_loss: 0.5211 - val_acc: 0.7483
Epoch 42/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5122 - acc: 0.7535 - val_loss: 0.5401 - val_acc: 0.7440
Epoch 43/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5121 - acc: 0.7537 - val_loss: 0.5244 - val_acc: 0.7481
Epoch 44/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5139 - acc: 0.7533 - val_loss: 0.5190 - val_acc: 0.7515
Epoch 45/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5117 - acc: 0.7536 - val_loss: 0.5238 - val_acc: 0.7503
Epoch 46/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5113 - acc: 0.7544 - val_loss: 0.5190 - val_acc: 0.7514
Epoch 47/100
390870/390870 [==============================] - 74s 188us/step - loss: 0.5112 - acc: 0.7541 - val_loss: 0.5202 - val_acc: 0.7507
Epoch 48/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5108 - acc: 0.7543 - val_loss: 0.5216 - val_acc: 0.7516
Epoch 49/100
390870/390870 [==============================] - 72s 183us/step - loss: 0.5107 - acc: 0.7543 - val_loss: 0.5286 - val_acc: 0.7482
Epoch 50/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5116 - acc: 0.7535 - val_loss: 0.5198 - val_acc: 0.7505
Epoch 51/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5111 - acc: 0.7538 - val_loss: 0.5203 - val_acc: 0.7493
Epoch 52/100
390870/390870 [==============================] - 71s 182us/step - loss: 0.5107 - acc: 0.7546 - val_loss: 0.5203 - val_acc: 0.7514
Epoch 53/100
390870/390870 [==============================] - 71s 183us/step - loss: 0.5107 - acc: 0.7540 - val_loss: 0.5216 - val_acc: 0.7491
Epoch 54/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5097 - acc: 0.7548 - val_loss: 0.5263 - val_acc: 0.7467
Epoch 55/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5091 - acc: 0.7549 - val_loss: 0.5264 - val_acc: 0.7455
Epoch 56/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5092 - acc: 0.7555 - val_loss: 0.5273 - val_acc: 0.7493
Epoch 57/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5091 - acc: 0.7556 - val_loss: 0.5221 - val_acc: 0.7493
Epoch 58/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5085 - acc: 0.7552 - val_loss: 0.5192 - val_acc: 0.7517
Epoch 59/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5089 - acc: 0.7554 - val_loss: 0.5267 - val_acc: 0.7464
Epoch 60/100
390870/390870 [==============================] - 75s 193us/step - loss: 0.5088 - acc: 0.7550 - val_loss: 0.5227 - val_acc: 0.7498
Epoch 61/100
390870/390870 [==============================] - 71s 181us/step - loss: 0.5082 - acc: 0.7555 - val_loss: 0.5191 - val_acc: 0.7517
Epoch 62/100
390870/390870 [==============================] - 67s 173us/step - loss: 0.5086 - acc: 0.7555 - val_loss: 0.5218 - val_acc: 0.7511
Epoch 63/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5088 - acc: 0.7551 - val_loss: 0.5211 - val_acc: 0.7500
Epoch 64/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5085 - acc: 0.7554 - val_loss: 0.5192 - val_acc: 0.7514
Epoch 65/100
390870/390870 [==============================] - 72s 184us/step - loss: 0.5078 - acc: 0.7552 - val_loss: 0.5199 - val_acc: 0.7503
Epoch 66/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5080 - acc: 0.7558 - val_loss: 0.5218 - val_acc: 0.7495
Epoch 67/100
390870/390870 [==============================] - 74s 189us/step - loss: 0.5084 - acc: 0.7552 - val_loss: 0.5231 - val_acc: 0.7485
Epoch 68/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5076 - acc: 0.7558 - val_loss: 0.5306 - val_acc: 0.7455
Epoch 69/100
390870/390870 [==============================] - 73s 187us/step - loss: 0.5073 - acc: 0.7560 - val_loss: 0.5270 - val_acc: 0.7504
Epoch 70/100
390870/390870 [==============================] - 73s 187us/step - loss: 0.5071 - acc: 0.7564 - val_loss: 0.5233 - val_acc: 0.7502
Epoch 71/100
390870/390870 [==============================] - 73s 186us/step - loss: 0.5076 - acc: 0.7558 - val_loss: 0.5326 - val_acc: 0.7471
Epoch 72/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5070 - acc: 0.7566 - val_loss: 0.5188 - val_acc: 0.7518
Epoch 73/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5071 - acc: 0.7562 - val_loss: 0.5230 - val_acc: 0.7499
Epoch 74/100
390870/390870 [==============================] - 70s 178us/step - loss: 0.5064 - acc: 0.7566 - val_loss: 0.5304 - val_acc: 0.7472
Epoch 75/100
390870/390870 [==============================] - 74s 190us/step - loss: 0.5064 - acc: 0.7567 - val_loss: 0.5190 - val_acc: 0.7519
Epoch 76/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5066 - acc: 0.7568 - val_loss: 0.5248 - val_acc: 0.7470
Epoch 77/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5066 - acc: 0.7563 - val_loss: 0.5291 - val_acc: 0.7492
Epoch 78/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5059 - acc: 0.7571 - val_loss: 0.5242 - val_acc: 0.7491
Epoch 79/100
390870/390870 [==============================] - 69s 178us/step - loss: 0.5058 - acc: 0.7569 - val_loss: 0.5234 - val_acc: 0.7493
Epoch 80/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5057 - acc: 0.7571 - val_loss: 0.5217 - val_acc: 0.7515
Epoch 81/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5058 - acc: 0.7571 - val_loss: 0.5193 - val_acc: 0.7515
Epoch 82/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5056 - acc: 0.7568 - val_loss: 0.5207 - val_acc: 0.7497
Epoch 83/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5052 - acc: 0.7577 - val_loss: 0.5324 - val_acc: 0.7428
Epoch 84/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5056 - acc: 0.7568 - val_loss: 0.5264 - val_acc: 0.7496
Epoch 85/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5056 - acc: 0.7573 - val_loss: 0.5225 - val_acc: 0.7511
Epoch 86/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5046 - acc: 0.7576 - val_loss: 0.5194 - val_acc: 0.7522
Epoch 87/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5052 - acc: 0.7569 - val_loss: 0.5229 - val_acc: 0.7509
Epoch 88/100
390870/390870 [==============================] - 72s 184us/step - loss: 0.5056 - acc: 0.7569 - val_loss: 0.5284 - val_acc: 0.7495
Epoch 89/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5052 - acc: 0.7573 - val_loss: 0.5270 - val_acc: 0.7508
Epoch 90/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5052 - acc: 0.7574 - val_loss: 0.5216 - val_acc: 0.7515
Epoch 91/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5047 - acc: 0.7582 - val_loss: 0.5327 - val_acc: 0.7487
Epoch 92/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5042 - acc: 0.7577 - val_loss: 0.5241 - val_acc: 0.7491
Epoch 93/100
390870/390870 [==============================] - 98s 251us/step - loss: 0.5041 - acc: 0.7578 - val_loss: 0.5283 - val_acc: 0.7463
Epoch 94/100
390870/390870 [==============================] - 152s 389us/step - loss: 0.5042 - acc: 0.7578 - val_loss: 0.5267 - val_acc: 0.7499
Epoch 95/100
390870/390870 [==============================] - 173s 442us/step - loss: 0.5038 - acc: 0.7579 - val_loss: 0.5319 - val_acc: 0.7487
Epoch 96/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5040 - acc: 0.7580 - val_loss: 0.5238 - val_acc: 0.7511
Epoch 97/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5039 - acc: 0.7580 - val_loss: 0.5258 - val_acc: 0.7498
Epoch 98/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5035 - acc: 0.7583 - val_loss: 0.5252 - val_acc: 0.7506
Epoch 99/100
390870/390870 [==============================] - 73s 188us/step - loss: 0.5043 - acc: 0.7576 - val_loss: 0.5237 - val_acc: 0.7501
Epoch 100/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5038 - acc: 0.7580 - val_loss: 0.5238 - val_acc: 0.7511
Out[33]:
<keras.callbacks.History at 0x1e2e3f38630>
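The validation loss plateaus around epoch 30-40 while the training loss keeps inching down, which suggests early stopping could save most of the 100 epochs. Keras provides `keras.callbacks.EarlyStopping` for this; the stopping rule itself is just a patience counter, sketched here on a plain list of validation losses (the function name and patience value are illustrative):

```python
# Patience-based early stopping: stop after `patience` epochs without improvement.
def best_epoch(val_losses, patience = 5):
    best_loss, best_idx, wait = float('inf'), 0, 0
    for epoch, loss in enumerate(val_losses, start = 1):
        if loss < best_loss:
            best_loss, best_idx, wait = loss, epoch, 0  # new best: reset counter
        else:
            wait += 1
            if wait >= patience:
                break  # no improvement for `patience` epochs: stop
    return best_idx, best_loss

print(best_epoch([0.56, 0.54, 0.53, 0.535, 0.54, 0.55, 0.54, 0.56, 0.57]))  # (3, 0.53)
```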
In [34]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all5.h5")
In [6]:
# Evaluate the model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 53s 95us/step
[0.5069767637241182, 0.757692707194954]
In [7]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[7]:
(153993, 2774)
In [8]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[8]:
array([[1.],
       [0.],
       [1.],
       ...,
       [0.],
       [1.],
       [0.]], dtype=float32)
In [9]:
set(binary_prediction[:,0])
Out[9]:
{0.0, 1.0}
In [10]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.69      0.43      0.53     50930
           1       0.76      0.90      0.83    103063

    accuracy                           0.75    153993
   macro avg       0.72      0.67      0.68    153993
weighted avg       0.74      0.75      0.73    153993

Accuracy for Deep Learning approach: 74.6254699888956
In [11]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[21761 29169]
 [ 9906 93157]]
In [12]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], binary_prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [13]:
_del_all()

NN 5

alpha = 7, hidden layers = 5, batch size = 500

In [14]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[14]:
(558386, 2774)
In [15]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
29
In [16]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [17]:
# Fit the model to the training set
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 500, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 57s 145us/step - loss: 0.6050 - acc: 0.6886 - val_loss: 0.5680 - val_acc: 0.7174
Epoch 2/100
390870/390870 [==============================] - 54s 138us/step - loss: 0.5585 - acc: 0.7214 - val_loss: 0.5550 - val_acc: 0.7256
Epoch 3/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5464 - acc: 0.7309 - val_loss: 0.5400 - val_acc: 0.7387
Epoch 4/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5406 - acc: 0.7345 - val_loss: 0.5605 - val_acc: 0.7164
Epoch 5/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5363 - acc: 0.7370 - val_loss: 0.5421 - val_acc: 0.7324
Epoch 6/100
390870/390870 [==============================] - 54s 137us/step - loss: 0.5341 - acc: 0.7390 - val_loss: 0.5412 - val_acc: 0.7340
Epoch 7/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5315 - acc: 0.7410 - val_loss: 0.5354 - val_acc: 0.7407
Epoch 8/100
390870/390870 [==============================] - 58s 149us/step - loss: 0.5304 - acc: 0.7415 - val_loss: 0.5401 - val_acc: 0.7375
Epoch 9/100
390870/390870 [==============================] - 54s 138us/step - loss: 0.5303 - acc: 0.7414 - val_loss: 0.5321 - val_acc: 0.7445
Epoch 10/100
390870/390870 [==============================] - 53s 134us/step - loss: 0.5277 - acc: 0.7431 - val_loss: 0.5394 - val_acc: 0.7354
Epoch 11/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5261 - acc: 0.7448 - val_loss: 0.5291 - val_acc: 0.7442
Epoch 12/100
390870/390870 [==============================] - 54s 139us/step - loss: 0.5260 - acc: 0.7445 - val_loss: 0.5308 - val_acc: 0.7437
Epoch 13/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5248 - acc: 0.7453 - val_loss: 0.5329 - val_acc: 0.7410
Epoch 14/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5240 - acc: 0.7463 - val_loss: 0.5294 - val_acc: 0.7449
Epoch 15/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5226 - acc: 0.7468 - val_loss: 0.5345 - val_acc: 0.7397
Epoch 16/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5220 - acc: 0.7478 - val_loss: 0.5270 - val_acc: 0.7451
Epoch 17/100
390870/390870 [==============================] - 54s 139us/step - loss: 0.5216 - acc: 0.7480 - val_loss: 0.5347 - val_acc: 0.7375
Epoch 18/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5216 - acc: 0.7479 - val_loss: 0.5338 - val_acc: 0.7427
Epoch 19/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5198 - acc: 0.7488 - val_loss: 0.5268 - val_acc: 0.7446
Epoch 20/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5201 - acc: 0.7490 - val_loss: 0.5260 - val_acc: 0.7464
Epoch 21/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5199 - acc: 0.7491 - val_loss: 0.5272 - val_acc: 0.7472
Epoch 22/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5187 - acc: 0.7503 - val_loss: 0.5312 - val_acc: 0.7460
Epoch 23/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5182 - acc: 0.7498 - val_loss: 0.5255 - val_acc: 0.7470
Epoch 24/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5185 - acc: 0.7500 - val_loss: 0.5231 - val_acc: 0.7478
Epoch 25/100
390870/390870 [==============================] - 55s 142us/step - loss: 0.5164 - acc: 0.7515 - val_loss: 0.5231 - val_acc: 0.7482
Epoch 26/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5180 - acc: 0.7499 - val_loss: 0.5375 - val_acc: 0.7416
Epoch 27/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5158 - acc: 0.7514 - val_loss: 0.5224 - val_acc: 0.7485
Epoch 28/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5167 - acc: 0.7513 - val_loss: 0.5280 - val_acc: 0.7449
Epoch 29/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5155 - acc: 0.7518 - val_loss: 0.5221 - val_acc: 0.7489
Epoch 30/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5144 - acc: 0.7524 - val_loss: 0.5238 - val_acc: 0.7487
Epoch 31/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5143 - acc: 0.7524 - val_loss: 0.5236 - val_acc: 0.7475
Epoch 32/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5152 - acc: 0.7516 - val_loss: 0.5229 - val_acc: 0.7478
Epoch 33/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5146 - acc: 0.7525 - val_loss: 0.5210 - val_acc: 0.7502
Epoch 34/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5137 - acc: 0.7525 - val_loss: 0.5263 - val_acc: 0.7480
Epoch 35/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5141 - acc: 0.7520 - val_loss: 0.5218 - val_acc: 0.7504
Epoch 36/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5126 - acc: 0.7537 - val_loss: 0.5221 - val_acc: 0.7487
Epoch 37/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5128 - acc: 0.7533 - val_loss: 0.5234 - val_acc: 0.7476
Epoch 38/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5128 - acc: 0.7529 - val_loss: 0.5282 - val_acc: 0.7457
Epoch 39/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5127 - acc: 0.7532 - val_loss: 0.5275 - val_acc: 0.7450
Epoch 40/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5123 - acc: 0.7532 - val_loss: 0.5284 - val_acc: 0.7454
Epoch 41/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5106 - acc: 0.7540 - val_loss: 0.5223 - val_acc: 0.7506
Epoch 42/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5110 - acc: 0.7539 - val_loss: 0.5356 - val_acc: 0.7429
Epoch 43/100
390870/390870 [==============================] - 53s 135us/step - loss: 0.5106 - acc: 0.7546 - val_loss: 0.5221 - val_acc: 0.7499
Epoch 44/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5114 - acc: 0.7540 - val_loss: 0.5223 - val_acc: 0.7497
Epoch 45/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5099 - acc: 0.7547 - val_loss: 0.5279 - val_acc: 0.7487
Epoch 46/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5097 - acc: 0.7551 - val_loss: 0.5227 - val_acc: 0.7501
Epoch 47/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5096 - acc: 0.7554 - val_loss: 0.5270 - val_acc: 0.7467
Epoch 48/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5095 - acc: 0.7552 - val_loss: 0.5200 - val_acc: 0.7490
Epoch 49/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5096 - acc: 0.7549 - val_loss: 0.5226 - val_acc: 0.7484
Epoch 50/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5095 - acc: 0.7551 - val_loss: 0.5250 - val_acc: 0.7493
Epoch 51/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5080 - acc: 0.7559 - val_loss: 0.5202 - val_acc: 0.7504
Epoch 52/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5088 - acc: 0.7552 - val_loss: 0.5212 - val_acc: 0.7499
Epoch 53/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5083 - acc: 0.7554 - val_loss: 0.5253 - val_acc: 0.7473
Epoch 54/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5082 - acc: 0.7559 - val_loss: 0.5205 - val_acc: 0.7509
Epoch 55/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5081 - acc: 0.7554 - val_loss: 0.5253 - val_acc: 0.7487
Epoch 56/100
390870/390870 [==============================] - 51s 129us/step - loss: 0.5080 - acc: 0.7555 - val_loss: 0.5207 - val_acc: 0.7509
Epoch 57/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5069 - acc: 0.7568 - val_loss: 0.5214 - val_acc: 0.7497
Epoch 58/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5073 - acc: 0.7564 - val_loss: 0.5199 - val_acc: 0.7509
Epoch 59/100
390870/390870 [==============================] - 77s 198us/step - loss: 0.5079 - acc: 0.7558 - val_loss: 0.5295 - val_acc: 0.7464
Epoch 60/100
390870/390870 [==============================] - 54s 138us/step - loss: 0.5064 - acc: 0.7564 - val_loss: 0.5291 - val_acc: 0.7469
Epoch 61/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5065 - acc: 0.7568 - val_loss: 0.5185 - val_acc: 0.7520
Epoch 62/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5065 - acc: 0.7568 - val_loss: 0.5221 - val_acc: 0.7496
Epoch 63/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5059 - acc: 0.7571 - val_loss: 0.5276 - val_acc: 0.7488
Epoch 64/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5057 - acc: 0.7574 - val_loss: 0.5204 - val_acc: 0.7505
Epoch 65/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5055 - acc: 0.7577 - val_loss: 0.5293 - val_acc: 0.7427
Epoch 66/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5046 - acc: 0.7582 - val_loss: 0.5190 - val_acc: 0.7508
Epoch 67/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5052 - acc: 0.7575 - val_loss: 0.5196 - val_acc: 0.7517
Epoch 68/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5049 - acc: 0.7580 - val_loss: 0.5246 - val_acc: 0.7480
Epoch 69/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5054 - acc: 0.7577 - val_loss: 0.5297 - val_acc: 0.7441
Epoch 70/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5044 - acc: 0.7576 - val_loss: 0.5272 - val_acc: 0.7490
Epoch 71/100
390870/390870 [==============================] - 52s 132us/step - loss: 0.5058 - acc: 0.7574 - val_loss: 0.5225 - val_acc: 0.7481
Epoch 72/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5039 - acc: 0.7583 - val_loss: 0.5221 - val_acc: 0.7486
Epoch 73/100
390870/390870 [==============================] - 60s 153us/step - loss: 0.5041 - acc: 0.7584 - val_loss: 0.5193 - val_acc: 0.7506
Epoch 74/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5035 - acc: 0.7582 - val_loss: 0.5223 - val_acc: 0.7491
Epoch 75/100
390870/390870 [==============================] - 51s 129us/step - loss: 0.5033 - acc: 0.7588 - val_loss: 0.5227 - val_acc: 0.7506
Epoch 76/100
390870/390870 [==============================] - 51s 129us/step - loss: 0.5033 - acc: 0.7582 - val_loss: 0.5282 - val_acc: 0.7470
Epoch 77/100
390870/390870 [==============================] - 55s 141us/step - loss: 0.5042 - acc: 0.7580 - val_loss: 0.5244 - val_acc: 0.7499
Epoch 78/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5034 - acc: 0.7586 - val_loss: 0.5261 - val_acc: 0.7466
Epoch 79/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5027 - acc: 0.7595 - val_loss: 0.5263 - val_acc: 0.7494
Epoch 80/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5028 - acc: 0.7592 - val_loss: 0.5273 - val_acc: 0.7460
Epoch 81/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5022 - acc: 0.7593 - val_loss: 0.5257 - val_acc: 0.7488
Epoch 82/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5025 - acc: 0.7589 - val_loss: 0.5220 - val_acc: 0.7502
Epoch 83/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5019 - acc: 0.7593 - val_loss: 0.5270 - val_acc: 0.7478
Epoch 84/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5023 - acc: 0.7594 - val_loss: 0.5259 - val_acc: 0.7492
Epoch 85/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5018 - acc: 0.7592 - val_loss: 0.5267 - val_acc: 0.7498
Epoch 86/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5021 - acc: 0.7591 - val_loss: 0.5302 - val_acc: 0.7445
Epoch 87/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5013 - acc: 0.7597 - val_loss: 0.5209 - val_acc: 0.7512
Epoch 88/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5010 - acc: 0.7596 - val_loss: 0.5268 - val_acc: 0.7466
Epoch 89/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5019 - acc: 0.7594 - val_loss: 0.5236 - val_acc: 0.7510
Epoch 90/100
390870/390870 [==============================] - 51s 129us/step - loss: 0.5012 - acc: 0.7598 - val_loss: 0.5240 - val_acc: 0.7489
Epoch 91/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5016 - acc: 0.7598 - val_loss: 0.5236 - val_acc: 0.7516
Epoch 92/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5013 - acc: 0.7596 - val_loss: 0.5237 - val_acc: 0.7501
Epoch 93/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5009 - acc: 0.7597 - val_loss: 0.5250 - val_acc: 0.7505
Epoch 94/100
390870/390870 [==============================] - 55s 142us/step - loss: 0.5003 - acc: 0.7599 - val_loss: 0.5223 - val_acc: 0.7501
Epoch 95/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5002 - acc: 0.7598 - val_loss: 0.5274 - val_acc: 0.7493
Epoch 96/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5002 - acc: 0.7599 - val_loss: 0.5247 - val_acc: 0.7497
Epoch 97/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5000 - acc: 0.7601 - val_loss: 0.5289 - val_acc: 0.7498
Epoch 98/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.4997 - acc: 0.7612 - val_loss: 0.5273 - val_acc: 0.7504
Epoch 99/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.4997 - acc: 0.7605 - val_loss: 0.5269 - val_acc: 0.7495
Epoch 100/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.4994 - acc: 0.7607 - val_loss: 0.5307 - val_acc: 0.7475
Out[17]:
<keras.callbacks.History at 0x28588237f28>
In [18]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all6.h5")
In [19]:
# Evaluate the model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 46s 83us/step
[0.5109344036041277, 0.7546177733688839]
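The first number returned by `evaluate` is the binary cross-entropy loss, the second the accuracy. As a sketch of what that loss measures, a hand-rolled `bce` helper (a hypothetical name, toy values, not the notebook's data) computing the same formula:

```python
import math

# Binary cross-entropy over a batch (minimal sketch):
# -mean( y*log(p) + (1-y)*log(1-p) ), clipped away from log(0)
def bce(y_true, y_pred, eps=1e-7):
    return -sum(y * math.log(max(p, eps)) + (1 - y) * math.log(max(1 - p, eps))
                for y, p in zip(y_true, y_pred)) / len(y_true)

print(round(bce([1, 0, 1], [0.9, 0.2, 0.8]), 4))  # -> 0.1839
```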
In [20]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[20]:
(153993, 2774)
In [21]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[21]:
array([[1.],
       [0.],
       [1.],
       ...,
       [1.],
       [1.],
       [1.]], dtype=float32)
In [22]:
set(binary_prediction[:,0])
Out[22]:
{0.0, 1.0}
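For a single column of sigmoid outputs, `_binarize` with `threshold = 0.5` reduces to an elementwise strict comparison (values strictly greater than the threshold map to 1). A toy sketch of the equivalence, on illustrative values:

```python
# Same thresholding as _binarize(prediction, threshold = 0.5),
# written out as a plain comparison on a list of sigmoid scores
probs = [0.91, 0.12, 0.5, 0.73]
labels = [1.0 if p > 0.5 else 0.0 for p in probs]  # note: 0.5 itself maps to 0
print(labels)  # -> [1.0, 0.0, 0.0, 1.0]
```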
In [23]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.73      0.34      0.46     50930
           1       0.74      0.94      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.74      0.64      0.65    153993
weighted avg       0.74      0.74      0.71    153993

Accuracy for Deep Learning approach: 73.97803796276455
In [24]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[17292 33638]
 [ 6434 96629]]
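The report's class-1 precision and recall, and the overall accuracy, can be re-derived from this matrix, which scikit-learn lays out as [[TN, FP], [FN, TP]]:

```python
# Cell counts from the confusion matrix printed above
tn, fp, fn, tp = 17292, 33638, 6434, 96629

precision = tp / (tp + fp)                 # of all predicted likes, how many were real
recall    = tp / (tp + fn)                 # of all real likes, how many were found
accuracy  = (tp + tn) / (tn + fp + fn + tp)

print(round(precision, 2), round(recall, 2), round(accuracy, 4))  # -> 0.74 0.94 0.7398
```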
In [25]:
# Draw the ROC curve. The thresholded predictions would give only a single
# operating point, so the raw sigmoid scores are used to trace the full curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction[:, 0])

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
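An ROC curve is traced by sweeping a decision threshold over the classifier's raw scores and recording one (FPR, TPR) point per threshold. A minimal pure-Python sketch of that sweep, on toy values rather than the notebook's data (`roc_points` is a hypothetical helper):

```python
# One (FPR, TPR) point per threshold: predict 1 wherever score >= t
def roc_points(y_true, scores, thresholds):
    pts = []
    p = sum(y_true)              # number of positives
    n = len(y_true) - p          # number of negatives
    for t in thresholds:
        tp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 0)
        pts.append((fp / n, tp / p))
    return pts

y = [0, 0, 1, 1]
s = [0.1, 0.4, 0.35, 0.8]
print(roc_points(y, s, [0.0, 0.5, 1.1]))  # -> [(1.0, 1.0), (0.0, 0.5), (0.0, 0.0)]
```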
In [26]:
_del_all()

NN 6

alpha = 6

number of hidden layers = 5

batch size = 100

In [27]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[27]:
(558386, 2774)
In [28]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 6

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
34
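The cell above applies a common rule of thumb for sizing hidden layers, N_h = N_s / (alpha * (N_i + N_o)). A sketch of that formula with the notebook's figures (558386 rows and 2769 features after dropping the five non-feature columns); `hidden_neurons` is a hypothetical helper:

```python
# Rule of thumb: hidden neurons = samples / (alpha * (inputs + outputs)),
# with alpha controlling how strongly the capacity is constrained
def hidden_neurons(n_samples, n_inputs, n_outputs, alpha):
    return round(n_samples / (alpha * (n_inputs + n_outputs)))

print(hidden_neurons(558386, 2769, 1, 6))  # -> 34
```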
In [29]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [30]:
# Fit the classifier to the training set
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 71s 183us/step - loss: 0.5860 - acc: 0.7012 - val_loss: 0.5567 - val_acc: 0.7211
Epoch 2/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5531 - acc: 0.7259 - val_loss: 0.5461 - val_acc: 0.7325
Epoch 3/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5453 - acc: 0.7313 - val_loss: 0.5470 - val_acc: 0.7292
Epoch 4/100
390870/390870 [==============================] - 65s 167us/step - loss: 0.5406 - acc: 0.7342 - val_loss: 0.5556 - val_acc: 0.7176
Epoch 5/100
390870/390870 [==============================] - 65s 166us/step - loss: 0.5377 - acc: 0.7368 - val_loss: 0.5365 - val_acc: 0.7417
Epoch 6/100
390870/390870 [==============================] - 70s 179us/step - loss: 0.5350 - acc: 0.7385 - val_loss: 0.5380 - val_acc: 0.7371
Epoch 7/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5324 - acc: 0.7404 - val_loss: 0.5296 - val_acc: 0.7453
Epoch 8/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5307 - acc: 0.7412 - val_loss: 0.5569 - val_acc: 0.7224
Epoch 9/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5291 - acc: 0.7430 - val_loss: 0.5283 - val_acc: 0.7449
Epoch 10/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5279 - acc: 0.7433 - val_loss: 0.5364 - val_acc: 0.7405
Epoch 11/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5261 - acc: 0.7454 - val_loss: 0.5308 - val_acc: 0.7451
Epoch 12/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5257 - acc: 0.7454 - val_loss: 0.5291 - val_acc: 0.7446
Epoch 13/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5246 - acc: 0.7458 - val_loss: 0.5282 - val_acc: 0.7453
Epoch 14/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5236 - acc: 0.7467 - val_loss: 0.5402 - val_acc: 0.7408
Epoch 15/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5226 - acc: 0.7469 - val_loss: 0.5656 - val_acc: 0.7069
Epoch 16/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5218 - acc: 0.7479 - val_loss: 0.5318 - val_acc: 0.7428
Epoch 17/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5214 - acc: 0.7483 - val_loss: 0.5306 - val_acc: 0.7426
Epoch 18/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5205 - acc: 0.7484 - val_loss: 0.5240 - val_acc: 0.7477
Epoch 19/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5203 - acc: 0.7486 - val_loss: 0.5261 - val_acc: 0.7462
Epoch 20/100
390870/390870 [==============================] - 72s 183us/step - loss: 0.5192 - acc: 0.7494 - val_loss: 0.5250 - val_acc: 0.7462
Epoch 21/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5190 - acc: 0.7495 - val_loss: 0.5293 - val_acc: 0.7428
Epoch 22/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5178 - acc: 0.7504 - val_loss: 0.5246 - val_acc: 0.7484
Epoch 23/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5172 - acc: 0.7507 - val_loss: 0.5313 - val_acc: 0.7447
Epoch 24/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5168 - acc: 0.7506 - val_loss: 0.5272 - val_acc: 0.7495
Epoch 25/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5168 - acc: 0.7504 - val_loss: 0.5352 - val_acc: 0.7368
Epoch 26/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5159 - acc: 0.7516 - val_loss: 0.5378 - val_acc: 0.7412
Epoch 27/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5163 - acc: 0.7510 - val_loss: 0.5219 - val_acc: 0.7488
Epoch 28/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5150 - acc: 0.7519 - val_loss: 0.5265 - val_acc: 0.7484
Epoch 29/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5142 - acc: 0.7520 - val_loss: 0.5203 - val_acc: 0.7489
Epoch 30/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5146 - acc: 0.7520 - val_loss: 0.5282 - val_acc: 0.7444
Epoch 31/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5138 - acc: 0.7522 - val_loss: 0.5228 - val_acc: 0.7481
Epoch 32/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5147 - acc: 0.7522 - val_loss: 0.5244 - val_acc: 0.7488
Epoch 33/100
390870/390870 [==============================] - 71s 181us/step - loss: 0.5133 - acc: 0.7528 - val_loss: 0.5283 - val_acc: 0.7472
Epoch 34/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5128 - acc: 0.7530 - val_loss: 0.5404 - val_acc: 0.7392
Epoch 35/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5131 - acc: 0.7530 - val_loss: 0.5231 - val_acc: 0.7475
Epoch 36/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5124 - acc: 0.7530 - val_loss: 0.5216 - val_acc: 0.7472
Epoch 37/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5121 - acc: 0.7531 - val_loss: 0.5203 - val_acc: 0.7502
Epoch 38/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5122 - acc: 0.7538 - val_loss: 0.5307 - val_acc: 0.7488
Epoch 39/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5108 - acc: 0.7542 - val_loss: 0.5241 - val_acc: 0.7482
Epoch 40/100
390870/390870 [==============================] - 68s 173us/step - loss: 0.5113 - acc: 0.7533 - val_loss: 0.5227 - val_acc: 0.7492
Epoch 41/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5105 - acc: 0.7546 - val_loss: 0.5185 - val_acc: 0.7502
Epoch 42/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5106 - acc: 0.7542 - val_loss: 0.5242 - val_acc: 0.7478
Epoch 43/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5099 - acc: 0.7543 - val_loss: 0.5295 - val_acc: 0.7486
Epoch 44/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5097 - acc: 0.7545 - val_loss: 0.5188 - val_acc: 0.7514
Epoch 45/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5098 - acc: 0.7547 - val_loss: 0.5262 - val_acc: 0.7463
Epoch 46/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5092 - acc: 0.7556 - val_loss: 0.5207 - val_acc: 0.7503
Epoch 47/100
390870/390870 [==============================] - 72s 183us/step - loss: 0.5089 - acc: 0.7551 - val_loss: 0.5212 - val_acc: 0.7496
Epoch 48/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5089 - acc: 0.7551 - val_loss: 0.5202 - val_acc: 0.7500
Epoch 49/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5085 - acc: 0.7552 - val_loss: 0.5279 - val_acc: 0.7454
Epoch 50/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5083 - acc: 0.7557 - val_loss: 0.5201 - val_acc: 0.7491
Epoch 51/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5079 - acc: 0.7560 - val_loss: 0.5370 - val_acc: 0.7446
Epoch 52/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5075 - acc: 0.7559 - val_loss: 0.5251 - val_acc: 0.7470
Epoch 53/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5079 - acc: 0.7562 - val_loss: 0.5235 - val_acc: 0.7495
Epoch 54/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5073 - acc: 0.7567 - val_loss: 0.5264 - val_acc: 0.7460
Epoch 55/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5070 - acc: 0.7565 - val_loss: 0.5272 - val_acc: 0.7467
Epoch 56/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5071 - acc: 0.7565 - val_loss: 0.5214 - val_acc: 0.7506
Epoch 57/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5068 - acc: 0.7563 - val_loss: 0.5289 - val_acc: 0.7469
Epoch 58/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5070 - acc: 0.7561 - val_loss: 0.5300 - val_acc: 0.7478
Epoch 59/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5062 - acc: 0.7567 - val_loss: 0.5801 - val_acc: 0.7259
Epoch 60/100
390870/390870 [==============================] - 70s 180us/step - loss: 0.5056 - acc: 0.7570 - val_loss: 0.5222 - val_acc: 0.7502
Epoch 61/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5059 - acc: 0.7568 - val_loss: 0.5229 - val_acc: 0.7499
Epoch 62/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5056 - acc: 0.7569 - val_loss: 0.5210 - val_acc: 0.7501
Epoch 63/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5055 - acc: 0.7574 - val_loss: 0.5225 - val_acc: 0.7503
Epoch 64/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5059 - acc: 0.7569 - val_loss: 0.5217 - val_acc: 0.7521
Epoch 65/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5050 - acc: 0.7572 - val_loss: 0.5291 - val_acc: 0.7444
Epoch 66/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5047 - acc: 0.7576 - val_loss: 0.5266 - val_acc: 0.7461
Epoch 67/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5047 - acc: 0.7576 - val_loss: 0.5246 - val_acc: 0.7491
Epoch 68/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5047 - acc: 0.7573 - val_loss: 0.5227 - val_acc: 0.7500
Epoch 69/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5041 - acc: 0.7581 - val_loss: 0.5244 - val_acc: 0.7497
Epoch 70/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5041 - acc: 0.7578 - val_loss: 0.5185 - val_acc: 0.7517
Epoch 71/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5045 - acc: 0.7575 - val_loss: 0.5247 - val_acc: 0.7488
Epoch 72/100
390870/390870 [==============================] - 66s 168us/step - loss: 0.5039 - acc: 0.7584 - val_loss: 0.5315 - val_acc: 0.7481
Epoch 73/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5038 - acc: 0.7583 - val_loss: 0.5238 - val_acc: 0.7498
Epoch 74/100
390870/390870 [==============================] - 72s 185us/step - loss: 0.5034 - acc: 0.7580 - val_loss: 0.5238 - val_acc: 0.7497
Epoch 75/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5033 - acc: 0.7587 - val_loss: 0.5217 - val_acc: 0.7495
Epoch 76/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5028 - acc: 0.7582 - val_loss: 0.5268 - val_acc: 0.7469
Epoch 77/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5033 - acc: 0.7584 - val_loss: 0.5227 - val_acc: 0.7494
Epoch 78/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5028 - acc: 0.7586 - val_loss: 0.5263 - val_acc: 0.7477
Epoch 79/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5022 - acc: 0.7589 - val_loss: 0.5261 - val_acc: 0.7477
Epoch 80/100
390870/390870 [==============================] - 67s 170us/step - loss: 0.5023 - acc: 0.7590 - val_loss: 0.5231 - val_acc: 0.7491
Epoch 81/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5022 - acc: 0.7592 - val_loss: 0.5223 - val_acc: 0.7512
Epoch 82/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.5018 - acc: 0.7593 - val_loss: 0.5234 - val_acc: 0.7496
Epoch 83/100
390870/390870 [==============================] - 173s 443us/step - loss: 0.5016 - acc: 0.7592 - val_loss: 0.5249 - val_acc: 0.7528
Epoch 84/100
390870/390870 [==============================] - 69s 176us/step - loss: 0.5019 - acc: 0.7590 - val_loss: 0.5272 - val_acc: 0.7499
Epoch 85/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5018 - acc: 0.7591 - val_loss: 0.5320 - val_acc: 0.7462
Epoch 86/100
390870/390870 [==============================] - 75s 193us/step - loss: 0.5014 - acc: 0.7593 - val_loss: 0.5244 - val_acc: 0.7509
Epoch 87/100
390870/390870 [==============================] - 71s 181us/step - loss: 0.5011 - acc: 0.7597 - val_loss: 0.5241 - val_acc: 0.7508
Epoch 88/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5008 - acc: 0.7597 - val_loss: 0.5280 - val_acc: 0.7500
Epoch 89/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5007 - acc: 0.7595 - val_loss: 0.5217 - val_acc: 0.7519
Epoch 90/100
390870/390870 [==============================] - 69s 177us/step - loss: 0.5014 - acc: 0.7594 - val_loss: 0.5226 - val_acc: 0.7497
Epoch 91/100
390870/390870 [==============================] - 68s 174us/step - loss: 0.5009 - acc: 0.7596 - val_loss: 0.5245 - val_acc: 0.7496
Epoch 92/100
390870/390870 [==============================] - 67s 173us/step - loss: 0.5010 - acc: 0.7590 - val_loss: 0.5284 - val_acc: 0.7489
Epoch 93/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.5005 - acc: 0.7598 - val_loss: 0.5343 - val_acc: 0.7456
Epoch 94/100
390870/390870 [==============================] - 67s 171us/step - loss: 0.5003 - acc: 0.7597 - val_loss: 0.5248 - val_acc: 0.7487
Epoch 95/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5003 - acc: 0.7601 - val_loss: 0.5264 - val_acc: 0.7486
Epoch 96/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.4998 - acc: 0.7605 - val_loss: 0.5259 - val_acc: 0.7515
Epoch 97/100
390870/390870 [==============================] - 66s 169us/step - loss: 0.5000 - acc: 0.7607 - val_loss: 0.5246 - val_acc: 0.7491
Epoch 98/100
390870/390870 [==============================] - 66s 170us/step - loss: 0.4996 - acc: 0.7610 - val_loss: 0.5377 - val_acc: 0.7410
Epoch 99/100
390870/390870 [==============================] - 72s 184us/step - loss: 0.5007 - acc: 0.7601 - val_loss: 0.5272 - val_acc: 0.7484
Epoch 100/100
390870/390870 [==============================] - 67s 172us/step - loss: 0.4996 - acc: 0.7608 - val_loss: 0.5210 - val_acc: 0.7509
Out[30]:
<keras.callbacks.History at 0x28588c76ac8>
In [31]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all7.h5")
In [32]:
# Evaluate the model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 48s 87us/step
[0.5033024442936678, 0.7597557961701017]
In [33]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[33]:
(153993, 2774)
In [34]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[34]:
array([[1.],
       [0.],
       [1.],
       ...,
       [1.],
       [1.],
       [0.]], dtype=float32)
In [35]:
set(binary_prediction[:,0])
Out[35]:
{0.0, 1.0}
In [36]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.69      0.42      0.52     50930
           1       0.76      0.91      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.66      0.67    153993
weighted avg       0.74      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.48585325306995
In [37]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[21192 29738]
 [ 9552 93511]]
In [38]:
# Draw the ROC curve. The thresholded predictions would give only a single
# operating point, so the raw sigmoid scores are used to trace the full curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction[:, 0])

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [39]:
_del_all()

NN 7

alpha = 6

number of hidden layers = 5

batch size = 500

In [4]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[4]:
(558386, 2774)
In [41]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 6

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
34
In [42]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [43]:
# Fit the classifier to the training set
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 500, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 60s 153us/step - loss: 0.6000 - acc: 0.6893 - val_loss: 0.5623 - val_acc: 0.7238
Epoch 2/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5547 - acc: 0.7240 - val_loss: 0.5505 - val_acc: 0.7272
Epoch 3/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5446 - acc: 0.7316 - val_loss: 0.5395 - val_acc: 0.7382
Epoch 4/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5374 - acc: 0.7368 - val_loss: 0.5323 - val_acc: 0.7432
Epoch 5/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5366 - acc: 0.7371 - val_loss: 0.5371 - val_acc: 0.7398
Epoch 6/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5339 - acc: 0.7391 - val_loss: 0.5352 - val_acc: 0.7395
Epoch 7/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5312 - acc: 0.7411 - val_loss: 0.5340 - val_acc: 0.7425
Epoch 8/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5290 - acc: 0.7423 - val_loss: 0.5493 - val_acc: 0.7272
Epoch 9/100
390870/390870 [==============================] - 52s 134us/step - loss: 0.5284 - acc: 0.7431 - val_loss: 0.5344 - val_acc: 0.7406
Epoch 10/100
390870/390870 [==============================] - 57s 145us/step - loss: 0.5262 - acc: 0.7445 - val_loss: 0.5320 - val_acc: 0.7417
Epoch 11/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5248 - acc: 0.7456 - val_loss: 0.5352 - val_acc: 0.7399
Epoch 12/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5244 - acc: 0.7457 - val_loss: 0.5249 - val_acc: 0.7468
Epoch 13/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5227 - acc: 0.7468 - val_loss: 0.5284 - val_acc: 0.7451
Epoch 14/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5223 - acc: 0.7481 - val_loss: 0.5247 - val_acc: 0.7475
Epoch 15/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5225 - acc: 0.7475 - val_loss: 0.5307 - val_acc: 0.7437
Epoch 16/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5212 - acc: 0.7476 - val_loss: 0.5321 - val_acc: 0.7429
Epoch 17/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5208 - acc: 0.7485 - val_loss: 0.5258 - val_acc: 0.7450
Epoch 18/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5198 - acc: 0.7484 - val_loss: 0.5278 - val_acc: 0.7457
Epoch 19/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5184 - acc: 0.7496 - val_loss: 0.5233 - val_acc: 0.7486
Epoch 20/100
390870/390870 [==============================] - 51s 130us/step - loss: 0.5182 - acc: 0.7496 - val_loss: 0.5240 - val_acc: 0.7476
Epoch 21/100
390870/390870 [==============================] - 50s 127us/step - loss: 0.5179 - acc: 0.7498 - val_loss: 0.5236 - val_acc: 0.7484
Epoch 22/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5163 - acc: 0.7512 - val_loss: 0.5221 - val_acc: 0.7483
Epoch 23/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5160 - acc: 0.7514 - val_loss: 0.5288 - val_acc: 0.7450
Epoch 24/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5153 - acc: 0.7514 - val_loss: 0.5319 - val_acc: 0.7449
Epoch 25/100
390870/390870 [==============================] - 56s 144us/step - loss: 0.5151 - acc: 0.7520 - val_loss: 0.5248 - val_acc: 0.7468
Epoch 26/100
390870/390870 [==============================] - 122s 311us/step - loss: 0.5141 - acc: 0.7520 - val_loss: 0.5198 - val_acc: 0.7498
Epoch 27/100
390870/390870 [==============================] - 53s 136us/step - loss: 0.5146 - acc: 0.7514 - val_loss: 0.5223 - val_acc: 0.7492
Epoch 28/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5138 - acc: 0.7523 - val_loss: 0.5232 - val_acc: 0.7480
Epoch 29/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5123 - acc: 0.7533 - val_loss: 0.5196 - val_acc: 0.7512
Epoch 30/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5130 - acc: 0.7525 - val_loss: 0.5197 - val_acc: 0.7503
Epoch 31/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5119 - acc: 0.7541 - val_loss: 0.5242 - val_acc: 0.7493
Epoch 32/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5123 - acc: 0.7531 - val_loss: 0.5248 - val_acc: 0.7480
Epoch 33/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5113 - acc: 0.7539 - val_loss: 0.5227 - val_acc: 0.7486
Epoch 34/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5113 - acc: 0.7541 - val_loss: 0.5208 - val_acc: 0.7507
Epoch 35/100
390870/390870 [==============================] - 52s 133us/step - loss: 0.5114 - acc: 0.7531 - val_loss: 0.5202 - val_acc: 0.7488
Epoch 36/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5106 - acc: 0.7544 - val_loss: 0.5245 - val_acc: 0.7461
Epoch 37/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5109 - acc: 0.7537 - val_loss: 0.5253 - val_acc: 0.7459
Epoch 38/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5099 - acc: 0.7546 - val_loss: 0.5179 - val_acc: 0.7513
Epoch 39/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5088 - acc: 0.7551 - val_loss: 0.5195 - val_acc: 0.7495
Epoch 40/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5093 - acc: 0.7549 - val_loss: 0.5187 - val_acc: 0.7514
Epoch 41/100
390870/390870 [==============================] - 50s 128us/step - loss: 0.5084 - acc: 0.7555 - val_loss: 0.5247 - val_acc: 0.7485
Epoch 42/100
390870/390870 [==============================] - 51s 131us/step - loss: 0.5084 - acc: 0.7553 - val_loss: 0.5258 - val_acc: 0.7479
[... epochs 43-99 elided: training loss decreases from 0.5077 to 0.4975 while val_loss plateaus around 0.52 ...]
Epoch 100/100
390870/390870 [==============================] - 50s 127us/step - loss: 0.4967 - acc: 0.7620 - val_loss: 0.5406 - val_acc: 0.7454
Out[43]:
<keras.callbacks.History at 0x2859abf6860>
In [44]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all8.h5")
In [5]:
# Evaluate the trained model on the training set
classifier = _load_model("../models/trained_deep_neural_network_all8.h5")  # reload the model saved above
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 54s 97us/step
[0.5078941215270898, 0.7580132739713417]
In [6]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[6]:
(153993, 2774)
In [7]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[7]:
array([[0.],
       [0.],
       [1.],
       ...,
       [1.],
       [1.],
       [1.]], dtype=float32)
In [8]:
set(binary_prediction[:,0])
Out[8]:
{0.0, 1.0}
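The `_binarize` call above maps each sigmoid score to a hard 0/1 label. A minimal NumPy sketch of the same thresholding (the scores here are made-up values, not the network's outputs):

```python
import numpy as np

# Toy scores standing in for the network's sigmoid outputs
# (hypothetical values, just to illustrate the thresholding step).
scores = np.array([[0.12], [0.49], [0.50], [0.73], [0.98]])

# Equivalent to sklearn's binarize(scores, threshold=0.5):
# values strictly greater than the threshold map to 1, the rest to 0.
labels = (scores > 0.5).astype(np.float32)

print(labels[:, 0])  # [0. 0. 0. 1. 1.]
```

Note that sklearn's `binarize` uses a strict `>` comparison, so a score exactly at the threshold maps to 0.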
In [9]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.67      0.44      0.53     50930
           1       0.76      0.89      0.82    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.67      0.68    153993
weighted avg       0.73      0.74      0.73    153993

Accuracy for Deep Learning approach: 74.25792081458248
In [10]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[22257 28673]
 [10968 92095]]
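The classification report can be cross-checked directly against the confusion matrix `[[TN, FP], [FN, TP]]` printed above:

```python
# Recompute the report's headline numbers from the confusion matrix above.
tn, fp, fn, tp = 22257, 28673, 10968, 92095

accuracy = (tn + tp) / (tn + fp + fn + tp)
precision_1 = tp / (tp + fp)  # precision for class 1
recall_1 = tp / (tp + fn)     # recall for class 1

print(round(accuracy, 4))     # 0.7426
print(round(precision_1, 2))  # 0.76
print(round(recall_1, 2))     # 0.89
```

These match the 74.26% accuracy and the class-1 row (0.76 / 0.89) reported above.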
In [11]:
# draw ROC curve from the continuous sigmoid scores
# (the binarized labels would collapse the curve to a single point)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [ ]:
_del_all()

NN 9

alpha = 8

number of hidden layers = 5

batch size = 100

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_2.pickle')
train_set.shape
Out[3]:
(558386, 2774)
In [4]:
# Number of neurons per hidden layer, from the rule of thumb
# N_hidden = N_samples / (alpha * (N_inputs + N_outputs))
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 8

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
25
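The value 25 follows directly from the sizing rule: the train set has 2774 columns, of which 5 are dropped as non-features, leaving 2769 inputs plus 1 output neuron:

```python
# Worked check of the hidden-layer sizing rule used above:
# N_hidden = round(N_samples / (alpha * (N_inputs + N_outputs)))
number_samples = 558386      # training rows
number_features = 2774 - 5   # 2774 columns minus the 5 non-feature columns dropped
alpha = 8

n_hidden = round(number_samples / (alpha * (number_features + 1)))
print(n_hidden)  # 25
```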
In [5]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [6]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 97s 249us/step - loss: 0.5884 - acc: 0.6991 - val_loss: 0.5596 - val_acc: 0.7209
[... epochs 2-99 elided: training loss decreases from 0.5530 to 0.5018 while val_loss plateaus around 0.52 ...]
Epoch 100/100
390870/390870 [==============================] - 87s 222us/step - loss: 0.5023 - acc: 0.7592 - val_loss: 0.5284 - val_acc: 0.7470
Out[6]:
<keras.callbacks.History at 0x19a25de07b8>
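Validation loss stops improving long before epoch 100, so a patience-based early-stopping rule could save most of the training time. A pure-Python sketch of the logic that Keras's `EarlyStopping` callback implements (monitoring `val_loss`; the sequence below is a short hypothetical example, not the run above):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch at which training would halt: the first epoch
    after `patience` consecutive epochs without a new best val_loss."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # ran to completion

print(early_stop_epoch([0.56, 0.54, 0.53, 0.535, 0.54, 0.532, 0.533, 0.534, 0.535]))  # 6
```

In Keras this corresponds to passing `callbacks=[EarlyStopping(monitor='val_loss', patience=3)]` to `fit`.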
In [7]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_all9.h5")
In [8]:
# Evaluate the trained model on the training set
classifier = _load_model("../models/trained_deep_neural_network_all9.h5")  # reload the model saved above
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 58s 104us/step
[0.5078941215270898, 0.7580132739713417]
In [9]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_2.pickle')
test_set.shape
Out[9]:
(153993, 2774)
In [10]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[10]:
array([[0.],
       [0.],
       [1.],
       ...,
       [1.],
       [1.],
       [1.]], dtype=float32)
In [11]:
set(binary_prediction[:,0])
Out[11]:
{0.0, 1.0}
In [12]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.67      0.44      0.53     50930
           1       0.76      0.89      0.82    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.67      0.68    153993
weighted avg       0.73      0.74      0.73    153993

Accuracy for Deep Learning approach: 74.25792081458248
In [13]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[22257 28673]
 [10968 92095]]
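Because the predictions fed to `_roc_curve` are hard 0/1 labels, the resulting "curve" is really a single operating point; its coordinates follow from the confusion matrix above:

```python
# The single ROC operating point implied by the confusion matrix
# [[TN, FP], [FN, TP]] printed above.
tn, fp, fn, tp = 22257, 28673, 10968, 92095

fpr = fp / (fp + tn)  # false positive rate
tpr = tp / (tp + fn)  # true positive rate

print(round(fpr, 3), round(tpr, 3))  # 0.563 0.894
```

Passing the continuous sigmoid scores instead would trace the full curve across thresholds.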
In [14]:
# draw ROC curve from the continuous sigmoid scores
# (the binarized labels would collapse the curve to a single point)
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [15]:
_del_all()

7.4 SVM without fake reviews

In [16]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[16]:
(558386, 787)
In [17]:
# keep only reviews whose bin_truth_score is set (drop rows flagged -1)
train_set = train_set[train_set['bin_truth_score'] != -1]
train_set.shape
Out[17]:
(438571, 787)
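The filter above is a plain pandas boolean mask: rows where the condition is `False` are dropped, shrinking the frame from 558386 to 438571 rows. A toy sketch of the same pattern (hypothetical data):

```python
import pandas as pd

# Toy frame illustrating the boolean-mask filter used above:
# rows with bin_truth_score == -1 are dropped.
df = pd.DataFrame({'bin_truth_score': [1, -1, 0, 1, -1],
                   'likes':           [1,  0, 1, 0,  1]})

filtered = df[df['bin_truth_score'] != -1]
print(filtered.shape)  # (3, 2)
```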
In [18]:
best_model = _jl.load("../models/best_SVM.joblib")
best_model.set_params(verbose=10)
best_model.get_params()
Out[18]:
{'C': 0.001,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 10}
In [19]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[LibLinear]
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Out[19]:
LinearSVC(C=0.001, class_weight=None, dual=True, fit_intercept=True,
          intercept_scaling=1, loss='squared_hinge', max_iter=50000,
          multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
          verbose=10)
In [20]:
_jl.dump(best_model, "../models/best_SVM_fake.joblib")
Out[20]:
['../models/best_SVM_fake.joblib']
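A fitted `LinearSVC` predicts by taking the sign of a linear decision function, w·x + b, built from the `coef_` and `intercept_` printed below. A tiny NumPy sketch with made-up weights (not the fitted model's):

```python
import numpy as np

# Toy illustration of how a linear SVM labels a sample: class 1 when
# the decision value w.x + b is positive, class 0 otherwise.
# These weights are invented for the example, not the fitted coef_.
w = np.array([0.5, -0.2, 0.1])
b = -0.05

x = np.array([1.0, 0.5, 0.2])
decision = round(float(w @ x + b), 2)  # 0.5 - 0.1 + 0.02 - 0.05 = 0.37
label = int(decision > 0)

print(decision, label)  # 0.37 1
```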
In [21]:
print("coef:", best_model.coef_)
print("intercept:", best_model.intercept_)
coef: [[-1.38515487e-01 -1.96857288e-01  3.32402060e-01 -2.91255272e-01
  -1.69144522e-01  4.39209749e-02  2.42195943e-02  3.64156997e-02
  -2.46790916e-02 -1.02787607e-02 -4.86873857e-02 -4.97584956e-05
  -3.22987927e-02  6.88759948e-03 -1.03371270e-05  6.50742395e-01
  -1.29562758e-04  1.03416295e-03 -2.48775106e-04 -6.59642374e-04
  -1.02036028e-04 -9.66791010e-04  1.14359591e-03 -1.99843890e-04
  -6.27763021e-05  3.69227212e-04  2.38756561e-06 -1.18488072e-05
   3.94093254e-05 -2.29569450e-04 -8.67235837e-02  8.89035480e-06
  -7.57976137e-04  3.93742819e-04 -4.55403368e-02 -3.54663199e-04
  -3.18629560e-02  5.23979128e-02  3.66236285e-04  1.77877828e-02
  -8.52227099e-04 -2.36155968e-04  9.56668382e-05  1.75930155e-04
   6.77026802e-05 -5.13757145e-03  1.89471500e-02 -8.09871718e-04
   3.60054753e-03  6.19090568e-03  3.79168413e-02 -8.57468383e-04
   5.00425524e-02  5.05812751e-03  2.68403245e-02  4.03692064e-02
   2.11274126e-02  2.88136834e-02  1.73991590e-02 -5.80037087e-03
  -6.35825432e-03 -1.38553686e-03  3.66697695e-03  4.09770364e-03
   5.23367932e-03  1.44654196e-02 -2.06413754e-02 -1.05040811e-02
   1.54167703e-05 -3.50850188e-03 -5.80631608e-03  2.45154713e-02
  -1.89971779e-02  2.69824728e-02  2.13443201e-02 -4.47468512e-03
   9.90304353e-03  6.76647465e-03  3.02454848e-03  4.90134130e-02
  -1.07009586e-01 -7.27267822e-02 -1.11518903e-01 -5.34070255e-02
  -1.12431655e-01 -1.25416591e-01 -1.07294096e-01 -9.18106856e-02
  -9.21504902e-02 -1.02044568e-01 -9.66343841e-02 -9.25763200e-02
  -7.10907125e-02 -7.08271627e-02 -8.05048905e-02 -6.88325061e-02
  -7.62345358e-02 -7.94675628e-02 -7.16101280e-02 -6.39430451e-02
   3.83200703e-03  1.24178084e-02  9.68276063e-03  2.88133507e-02
   ...
   3.24610758e-03 -4.83750860e-03]]
intercept: [-0.29125527]
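The printed `coef_` and intercept fully determine the linear SVM's predictions: an example is labeled 1 when the score w·x + b is positive. A minimal sketch of that decision rule with toy weights (not the fitted values above, which have one weight per feature column):

```python
import numpy as _np

# toy weight vector and intercept (illustrative values only)
w = _np.array([0.5, -0.2, 0.1])
b = -0.29

X = _np.array([[1.0, 0.0, 2.0],   # score = 0.41 -> class 1
               [0.0, 3.0, 0.0]])  # score = -0.89 -> class 0
scores = X @ w + b
preds = (scores > 0).astype(int)
print(preds)  # [1 0]
```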
In [22]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.shape
Out[22]:
(153993, 787)
In [23]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
predictions:
 [0 1 0 ... 0 1 1]
In [24]:
set(predic)
Out[24]:
{0, 1}
In [25]:
# evaluate classifier

print("Report for Support Vector Machine:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Support Vector Machine:", _accuracy_score(test_set['likes'], predic)*100)
Report for Support Vector Machine:
              precision    recall  f1-score   support

           0       0.65      0.35      0.45     50930
           1       0.74      0.91      0.81    103063

    accuracy                           0.72    153993
   macro avg       0.70      0.63      0.63    153993
weighted avg       0.71      0.72      0.70    153993

Accuracy for Support Vector Machine: 72.28900014935743
In [26]:
# Confusion matrix for SVC

print("Confusion Matrix for SVC: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for SVC: 
Out[26]:
array([[17698, 33232],
       [ 9441, 93622]], dtype=int64)
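The classification report and accuracy above can be recovered directly from this confusion matrix (rows are true labels, columns are predictions):

```python
# values from the confusion matrix above: [[TN, FP], [FN, TP]]
tn, fp, fn, tp = 17698, 33232, 9441, 93622

precision_1 = tp / (tp + fp)                   # 0.74 in the report
recall_1    = tp / (tp + fn)                   # 0.91 in the report
accuracy    = (tp + tn) / (tn + fp + fn + tp)  # 0.7229 overall

print(round(precision_1, 2), round(recall_1, 2), round(accuracy, 4))
```

The low recall for class 0 (0.35) is visible in the first row: most true 0s (33232 of 50930) are predicted as 1.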
In [27]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("SVM ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
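Since `predic` holds hard 0/1 labels, `_roc_curve` can only place a single operating point between the corners, so the plot is two line segments. Feeding continuous scores (e.g. the output of `LinearSVC.decision_function`) would trace a full curve; a small synthetic illustration of the difference:

```python
import numpy as _np
from sklearn.metrics import roc_curve as _roc_curve

rng = _np.random.RandomState(0)
y_true = rng.randint(0, 2, 50)
hard_preds = rng.randint(0, 2, 50)   # 0/1 labels, like predic
soft_scores = rng.rand(50)           # continuous scores

fpr_h, tpr_h, _ = _roc_curve(y_true, hard_preds)
fpr_s, tpr_s, _ = _roc_curve(y_true, soft_scores, drop_intermediate=False)
print(len(fpr_h), len(fpr_s))  # 3 points vs one per distinct score (+1)
```

With only three ROC points, any AUC computed from this curve is a coarse estimate; the continuous-score version is the one usually reported.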
In [28]:
_del_all()

7.5 Random forest without fake reviews

In [29]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[29]:
(558386, 787)
In [30]:
train_set = train_set[train_set['bin_truth_score']!=-1]
train_set.shape
Out[30]:
(438571, 787)
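The boolean mask above drops every review whose `bin_truth_score` is -1 (the rows flagged as fake), shrinking the training set from 558386 to 438571 rows. A toy sketch of the same pattern (column values assumed for illustration):

```python
import pandas as _pd

toy = _pd.DataFrame({'bin_truth_score': [1, -1, 0, -1, 1],
                     'likes':           [1,  0, 1,  1, 0]})
# keep only rows with a known truth label
kept = toy[toy['bin_truth_score'] != -1]
print(kept.shape)  # (3, 2)
```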
In [31]:
params = _jl.load("../models/best_Random_Forest_2.joblib").get_params()
params['n_jobs'] = -1
params['verbose'] = 10
best_model = _RandomForestClassifier(**params)
best_model.get_params()
Out[31]:
{'bootstrap': False,
 'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 50,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 2,
 'min_samples_split': 10,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 1000,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': None,
 'verbose': 10,
 'warm_start': False}
In [32]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 12 concurrent workers.
building tree 1 of 1000
building tree 2 of 1000
...
[Parallel(n_jobs=-1)]: Done 104 tasks      | elapsed:  2.9min
...
[Parallel(n_jobs=-1)]: Done 397 tasks      | elapsed: 11.0min
...
[Parallel(n_jobs=-1)]: Done 737 tasks      | elapsed: 20.3min
...
building tree 781 of 1000
building tree 782 of 1000
building tree 783 of 1000
building tree 784 of 1000
building tree 785 of 1000
building tree 786 of 1000
building tree 787 of 1000
[Parallel(n_jobs=-1)]: Done 776 tasks      | elapsed: 21.3min
building tree 788 of 1000
building tree 789 of 1000
building tree 790 of 1000
building tree 791 of 1000
building tree 792 of 1000
building tree 793 of 1000
building tree 794 of 1000
building tree 795 of 1000
building tree 796 of 1000
building tree 797 of 1000
building tree 798 of 1000
building tree 799 of 1000
building tree 800 of 1000
building tree 801 of 1000
building tree 802 of 1000
building tree 803 of 1000
building tree 804 of 1000
building tree 805 of 1000
building tree 806 of 1000
building tree 807 of 1000
building tree 808 of 1000
building tree 809 of 1000
building tree 810 of 1000
building tree 811 of 1000
building tree 812 of 1000
building tree 813 of 1000
building tree 814 of 1000
building tree 815 of 1000
building tree 816 of 1000
building tree 817 of 1000
building tree 818 of 1000
building tree 819 of 1000
building tree 820 of 1000
building tree 821 of 1000
building tree 822 of 1000
building tree 823 of 1000
building tree 824 of 1000
building tree 825 of 1000
building tree 826 of 1000
building tree 827 of 1000
building tree 828 of 1000
[Parallel(n_jobs=-1)]: Done 817 tasks      | elapsed: 22.4min
building tree 829 of 1000
building tree 830 of 1000
building tree 831 of 1000
building tree 832 of 1000
building tree 833 of 1000
building tree 834 of 1000
building tree 835 of 1000
building tree 836 of 1000
building tree 837 of 1000
building tree 838 of 1000
building tree 839 of 1000
building tree 840 of 1000
building tree 841 of 1000
building tree 842 of 1000
building tree 843 of 1000
building tree 844 of 1000
building tree 845 of 1000
building tree 846 of 1000
building tree 847 of 1000
building tree 848 of 1000
building tree 849 of 1000
building tree 850 of 1000
building tree 851 of 1000
building tree 852 of 1000
building tree 853 of 1000
building tree 854 of 1000
building tree 855 of 1000
building tree 856 of 1000
building tree 857 of 1000
building tree 858 of 1000
building tree 859 of 1000
building tree 860 of 1000
building tree 861 of 1000
building tree 862 of 1000
building tree 863 of 1000
building tree 864 of 1000
building tree 865 of 1000
building tree 866 of 1000
building tree 867 of 1000
building tree 868 of 1000
building tree 869 of 1000
[Parallel(n_jobs=-1)]: Done 858 tasks      | elapsed: 23.6min
building tree 870 of 1000
building tree 871 of 1000
building tree 872 of 1000
building tree 873 of 1000
building tree 874 of 1000
building tree 875 of 1000
building tree 876 of 1000
building tree 877 of 1000
building tree 878 of 1000
building tree 879 of 1000
building tree 880 of 1000
building tree 881 of 1000
building tree 882 of 1000
building tree 883 of 1000
building tree 884 of 1000
building tree 885 of 1000
building tree 886 of 1000
building tree 887 of 1000
building tree 888 of 1000
building tree 889 of 1000
building tree 890 of 1000
building tree 891 of 1000
building tree 892 of 1000
building tree 893 of 1000
building tree 894 of 1000
building tree 895 of 1000
building tree 896 of 1000
building tree 897 of 1000
building tree 898 of 1000
building tree 899 of 1000
building tree 900 of 1000
building tree 901 of 1000
building tree 902 of 1000
building tree 903 of 1000
building tree 904 of 1000
building tree 905 of 1000
building tree 906 of 1000
building tree 907 of 1000
building tree 908 of 1000
building tree 909 of 1000
building tree 910 of 1000
building tree 911 of 1000
building tree 912 of 1000
[Parallel(n_jobs=-1)]: Done 901 tasks      | elapsed: 24.7min
building tree 913 of 1000
building tree 914 of 1000
building tree 915 of 1000
building tree 916 of 1000
building tree 917 of 1000
building tree 918 of 1000
building tree 919 of 1000
building tree 920 of 1000
building tree 921 of 1000
building tree 922 of 1000
building tree 923 of 1000
building tree 924 of 1000
building tree 925 of 1000
building tree 926 of 1000
building tree 927 of 1000
building tree 928 of 1000
building tree 929 of 1000
building tree 930 of 1000
building tree 931 of 1000
building tree 932 of 1000
building tree 933 of 1000
building tree 934 of 1000
building tree 935 of 1000
building tree 936 of 1000
building tree 937 of 1000
building tree 938 of 1000
building tree 939 of 1000
building tree 940 of 1000
building tree 941 of 1000
building tree 942 of 1000
building tree 943 of 1000
building tree 944 of 1000
building tree 945 of 1000
building tree 946 of 1000
building tree 947 of 1000
building tree 948 of 1000
building tree 949 of 1000
building tree 950 of 1000
building tree 951 of 1000
building tree 952 of 1000
building tree 953 of 1000
building tree 954 of 1000
building tree 955 of 1000
[Parallel(n_jobs=-1)]: Done 944 tasks      | elapsed: 25.9min
building tree 956 of 1000
building tree 957 of 1000
building tree 958 of 1000
building tree 959 of 1000
building tree 960 of 1000
building tree 961 of 1000
building tree 962 of 1000
building tree 963 of 1000
building tree 964 of 1000
building tree 965 of 1000
building tree 966 of 1000
building tree 967 of 1000
building tree 968 of 1000
building tree 969 of 1000
building tree 970 of 1000
building tree 971 of 1000
building tree 972 of 1000
building tree 973 of 1000
building tree 974 of 1000
building tree 975 of 1000
building tree 976 of 1000
building tree 977 of 1000
building tree 978 of 1000
building tree 979 of 1000
building tree 980 of 1000
building tree 981 of 1000
building tree 982 of 1000
building tree 983 of 1000
building tree 984 of 1000
building tree 985 of 1000
building tree 986 of 1000
building tree 987 of 1000
building tree 988 of 1000
building tree 989 of 1000
building tree 990 of 1000
building tree 991 of 1000
building tree 992 of 1000
building tree 993 of 1000
building tree 994 of 1000
building tree 995 of 1000
building tree 996 of 1000
building tree 997 of 1000
building tree 998 of 1000
building tree 999 of 1000
building tree 1000 of 1000
[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed: 27.3min finished
Out[32]:
RandomForestClassifier(bootstrap=False, class_weight=None, criterion='entropy',
                       max_depth=50, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=2, min_samples_split=10,
                       min_weight_fraction_leaf=0.0, n_estimators=1000,
                       n_jobs=-1, oob_score=False, random_state=None,
                       verbose=10, warm_start=False)
In [33]:
_jl.dump(best_model, "../models/best_Random_Forest_fake.joblib")
Out[33]:
['../models/best_Random_Forest_fake.joblib']
In [34]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.shape
Out[34]:
(153993, 787)
In [35]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
[Parallel(n_jobs=12)]: Using backend ThreadingBackend with 12 concurrent workers.
[... progress output truncated ...]
predictions:
 [0 0 1 ... 0 0 0]
[Parallel(n_jobs=12)]: Done 1000 out of 1000 | elapsed:   24.5s finished
In [36]:
set(predic)
Out[36]:
{0, 1}
In [37]:
# evaluate classifier

print("Report for Random Forest classifier:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Random Forest Classifier:", _accuracy_score(test_set['likes'], predic)*100)
Report for Random Forest classifier:
              precision    recall  f1-score   support

           0       0.68      0.41      0.51     50930
           1       0.76      0.90      0.82    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.66      0.67    153993
weighted avg       0.73      0.74      0.72    153993

Accuracy for Random Forest Classifier: 74.14947432675511
In [38]:
# Confusion matrix for Random Forest

print("Confusion Matrix for Random Forest: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for Random Forest: 
Out[38]:
array([[21058, 29872],
       [ 9936, 93127]], dtype=int64)
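As a sanity check (a small pure-Python sketch, not part of the original pipeline), the per-class precision/recall and the accuracy reported above can be recomputed directly from this confusion matrix:

```python
# Confusion matrix from the cell above: rows = true class, columns = predicted class
tn, fp = 21058, 29872   # true class 0 (dislikes)
fn, tp = 9936, 93127    # true class 1 (likes)

precision_0 = tn / (tn + fn)
recall_0    = tn / (tn + fp)
precision_1 = tp / (tp + fp)
recall_1    = tp / (tp + fn)
accuracy    = (tn + tp) / (tn + fp + fn + tp)

print(round(precision_0, 2), round(recall_0, 2))  # 0.68 0.41
print(round(precision_1, 2), round(recall_1, 2))  # 0.76 0.9
print(round(accuracy * 100, 2))                   # 74.15
```

These match the classification report and accuracy score printed by sklearn above.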
In [39]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("Random Forest ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
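One caveat (an observation, not a change to the original pipeline): `_roc_curve` receives the hard 0/1 predictions here, so the "curve" reduces to a single interior operating point joined to (0, 0) and (1, 1). Passing continuous scores (e.g. from the forest's `predict_proba`) would trace a full curve. A tiny pure-Python illustration of the difference:

```python
def roc_points(y_true, scores):
    # one (FPR, TPR) point per distinct threshold, scanned from high to low
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = [(0.0, 0.0)]
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(y_true, scores) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

y = [0, 0, 1, 1]
print(len(roc_points(y, [0, 1, 0, 1])))           # 3: hard labels give one interior point
print(len(roc_points(y, [0.1, 0.6, 0.4, 0.9])))   # 5: distinct scores trace a real curve
```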
In [40]:
_del_all()

7.6 Neural Network without fake reviews

In [3]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[3]:
(558386, 787)
In [4]:
train_set = train_set[train_set['bin_truth_score']!=-1]
train_set.shape
Out[4]:
(438571, 787)
In [5]:
# Number of neurons for the hidden layers
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
80
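The rule of thumb above, N_hidden = N_samples / (alpha * (N_in + N_out)), can be checked by hand with the shapes printed earlier (787 columns minus the 5 dropped identifier/target columns gives 782 features):

```python
number_samples = 438571    # rows remaining after filtering bin_truth_score != -1
number_features = 787 - 5  # total columns minus the 5 dropped ones
alpha = 7

hidden = round(number_samples / (alpha * (number_features + 1)))
print(hidden)  # 80, matching the value printed above
```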
In [6]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
In [7]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 306999 samples, validate on 131572 samples
Epoch 1/100
306999/306999 [==============================] - 30s 98us/step - loss: 0.6015 - acc: 0.6869 - val_loss: 0.5663 - val_acc: 0.7137
[... per-epoch progress output truncated ...]
Epoch 100/100
306999/306999 [==============================] - 23s 74us/step - loss: 0.5106 - acc: 0.7529 - val_loss: 0.5847 - val_acc: 0.7446
306999/306999 [==============================] - 23s 74us/step - loss: 0.5106 - acc: 0.7529 - val_loss: 0.5847 - val_acc: 0.7446
Out[7]:
<keras.callbacks.History at 0x1ad8c4c5b00>
In [8]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_fake.h5")
In [9]:
# evaluate the model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
438571/438571 [==============================] - 12s 28us/step
[0.5348048495635159, 0.7506811895911467]
In [10]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set.shape
Out[10]:
(153993, 787)
In [11]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[11]:
array([[1.],
       [0.],
       [1.],
       ...,
       [0.],
       [1.],
       [1.]], dtype=float32)
In [12]:
set(binary_prediction[:,0])
Out[12]:
{0.0, 1.0}
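For reference, `_binarize` with `threshold=0.5` maps every probability strictly greater than 0.5 to 1 and everything else to 0; a minimal plain-Python equivalent:

```python
def binarize_scores(scores, threshold=0.5):
    # values strictly greater than the threshold map to 1, the rest to 0
    return [1.0 if s > threshold else 0.0 for s in scores]

print(binarize_scores([0.9, 0.2, 0.5, 0.51]))  # [1.0, 0.0, 0.0, 1.0]
```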
In [13]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.69      0.39      0.50     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.65      0.66    153993
weighted avg       0.73      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.06700304559298
In [14]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[19674 31256]
 [ 8679 94384]]
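The same arithmetic as before recovers the headline numbers and makes the comparison with the random forest concrete (a small pure-Python sketch, reusing both confusion matrices printed above): the network correctly identifies slightly fewer dislikes but slightly more likes:

```python
# Confusion matrices as [[tn, fp], [fn, tp]]
rf = [[21058, 29872], [9936, 93127]]   # Random Forest (confusion matrix earlier above)
nn = [[19674, 31256], [8679, 94384]]   # neural network (cell above)

def accuracy(m):
    return (m[0][0] + m[1][1]) / sum(sum(row) for row in m)

def recall_1(m):
    return m[1][1] / (m[1][0] + m[1][1])

print(round(accuracy(rf) * 100, 2), round(accuracy(nn) * 100, 2))  # 74.15 74.07
print(round(recall_1(rf), 3), round(recall_1(nn), 3))              # 0.904 0.916
```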
In [15]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], binary_prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
In [16]:
_del_all()

7.7 SVM with standard columns only

In [3]:
# column groups in the dataset: we keep the standard columns and drop the binary and real-valued variants

_cols_std = ['cuisine_av_hist', 'coll_score', 'average_stars_review', 'num_reviews_review',
            'average_stars_user', 'num_reviews_user',
            'av_rat_chinese_cuisine', 'av_rat_japanese_cuisine', 'av_rat_mexican_cuisine', 'av_rat_italian_cuisine', 
            'av_rat_others_cuisine', 'av_rat_american_cuisine', 'av_rat_korean_cuisine', 'av_rat_mediterranean_cuisine',
            'av_rat_thai_cuisine', 'av_rat_asianfusion_cuisine']


_cols_bin = ['cuisine_av_hist_bin', 'bin_truth_score', 'coll_score_bin', 'average_stars_bin_review',
            'num_reviews_bin_review', 'average_stars_bin_user', 'num_reviews_bin_user',
            'av_rat_chinese_cuisine_bin', 'av_rat_japanese_cuisine_bin', 'av_rat_mexican_cuisine_bin', 
            'av_rat_italian_cuisine_bin', 'av_rat_others_cuisine_bin', 'av_rat_american_cuisine_bin', 
            'av_rat_korean_cuisine_bin', 'av_rat_mediterranean_cuisine_bin', 'av_rat_thai_cuisine_bin', 
            'av_rat_asianfusion_cuisine_bin']


_cols_real = ['cuisine_av_hist_real', 'real_truth_score', 'coll_score_real', 'average_stars_real_review',
             'num_reviews_real_review', 'average_stars_real_user', 'num_reviews_real_user',
             'av_rat_chinese_cuisine_real', 'av_rat_japanese_cuisine_real', 'av_rat_mexican_cuisine_real', 
             'av_rat_italian_cuisine_real', 'av_rat_others_cuisine_real', 'av_rat_american_cuisine_real', 
             'av_rat_korean_cuisine_real', 'av_rat_mediterranean_cuisine_real', 'av_rat_thai_cuisine_real', 
             'av_rat_asianfusion_cuisine_real']
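Since the variants differ only by a `bin`/`real` token, the three groups could also be derived from the column names rather than typed by hand. The convention is not a pure suffix (compare `coll_score_bin`, `average_stars_bin_review`, `bin_truth_score`), so a small regex is needed; a hypothetical sketch on a sample of the names above:

```python
import re

# sample column names following the notebook's convention, where the
# 'bin'/'real' token can appear as suffix, infix, or prefix
columns = ['coll_score', 'coll_score_bin', 'coll_score_real',
           'bin_truth_score', 'real_truth_score',
           'average_stars_review', 'average_stars_bin_review', 'average_stars_real_review']

def with_token(tok):
    # match the token as a whole underscore-delimited component
    return [c for c in columns if re.search(rf'(^|_){tok}(_|$)', c)]

cols_bin, cols_real = with_token('bin'), with_token('real')
cols_std = [c for c in columns if c not in cols_bin + cols_real]

print(cols_std)  # only the standard variants remain
```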
In [4]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[4]:
(558386, 787)
In [5]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_real])
train_set.shape
Out[5]:
(558386, 753)
In [6]:
best_model = _jl.load("../models/best_SVM.joblib")
best_model.set_params(verbose=10)
best_model.get_params()
Out[6]:
{'C': 0.001,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 10}
In [7]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[LibLinear]
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Out[7]:
LinearSVC(C=0.001, class_weight=None, dual=True, fit_intercept=True,
          intercept_scaling=1, loss='squared_hinge', max_iter=50000,
          multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
          verbose=10)
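The `ConvergenceWarning` above means liblinear exhausted `max_iter` before converging. Beyond raising the iteration budget, standardizing the features usually makes a linear SVM converge much faster; a hedged sketch on synthetic data (one common remedy, not what this notebook does):

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# zero-mean / unit-variance features keep liblinear's updates well scaled
clf = make_pipeline(StandardScaler(),
                    LinearSVC(C=0.001, max_iter=50000, random_state=0))
clf.fit(X, y)
print(clf.score(X, y))
```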
In [8]:
_jl.dump(best_model, "../models/best_SVM_std.joblib")
Out[8]:
['../models/best_SVM_std.joblib']
In [9]:
print("coef:", best_model.coef_)
print("intercept:", best_model.intercept_)
coef: [[-1.45163681e-01 -2.06192678e-01  3.43101354e-01 ...  2.29271606e-03
   3.38308568e-03 -6.46594197e-03]]
intercept: [-0.36083029]
In [10]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_real])
test_set.shape
Out[10]:
(153993, 753)
In [11]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
predictions:
 [1 0 1 ... 0 0 0]
In [12]:
set(predic)
Out[12]:
{0, 1}
In [13]:
# evaluate classifier

print("Report for Support Vector Machine:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Support Vector Machine:", _accuracy_score(test_set['likes'], predic)*100)
Report for Support Vector Machine:
              precision    recall  f1-score   support

           0       0.70      0.36      0.48     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.64      0.65    153993
weighted avg       0.73      0.74      0.71    153993

Accuracy for Support Vector Machine: 73.79556213594124
In [14]:
# Confusion matrix for SVC

print("Confusion Matrix for SVC: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for SVC: 
Out[14]:
array([[18342, 32588],
       [ 7765, 95298]], dtype=int64)
In [15]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("SVM ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
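As with the neural network above, `predict` yields hard 0/1 labels, so this ROC plot also has a single interior point. `LinearSVC` exposes continuous margins through `decision_function`, which would give the full curve; a small self-contained sketch on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_curve, auc
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, random_state=0)
svm = LinearSVC(C=0.001, max_iter=50000, random_state=0).fit(X, y)

scores = svm.decision_function(X)   # signed distances to the hyperplane
fpr, tpr, _ = roc_curve(y, scores)  # one operating point per threshold
print(f"AUC = {auc(fpr, tpr):.3f}")
```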
In [16]:
_del_all()

7.8 Random Forest with standard columns only

In [17]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[17]:
(558386, 787)
In [18]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_real])
train_set.shape
Out[18]:
(558386, 753)
In [19]:
params = _jl.load("../models/best_Random_Forest_2.joblib").get_params()
params['n_jobs'] = -1
params['verbose'] = 10
best_model = _RandomForestClassifier(**params)
best_model.get_params()
Out[19]:
{'bootstrap': False,
 'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 50,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 2,
 'min_samples_split': 10,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 1000,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': None,
 'verbose': 10,
 'warm_start': False}
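Rebuilding the classifier from `get_params()` works; scikit-learn also offers `sklearn.base.clone`, which copies an estimator's hyperparameters (but not its fitted state) in one call. A sketch of the same pattern:

```python
from sklearn.base import clone
from sklearn.ensemble import RandomForestClassifier

tuned = RandomForestClassifier(n_estimators=1000, criterion='entropy', max_depth=50)

fresh = clone(tuned)                      # same hyperparameters, unfitted
fresh.set_params(n_jobs=-1, verbose=10)   # then adjust only the runtime knobs

print(fresh.get_params()['criterion'])
```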
In [20]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 12 concurrent workers.
building tree 1 of 1000
[… per-tree progress lines trimmed …]
[Parallel(n_jobs=-1)]: Done 554 tasks      | elapsed: 18.0min
building tree 586 of 1000
building tree 587 of 1000
building tree 588 of 1000
building tree 589 of 1000
building tree 590 of 1000
building tree 591 of 1000
building tree 592 of 1000
building tree 593 of 1000
building tree 594 of 1000
building tree 595 of 1000
building tree 596 of 1000
building tree 597 of 1000
building tree 598 of 1000
building tree 599 of 1000
building tree 600 of 1000
[Parallel(n_jobs=-1)]: Done 589 tasks      | elapsed: 19.2min
building tree 601 of 1000
building tree 602 of 1000
building tree 603 of 1000
building tree 604 of 1000
building tree 605 of 1000
building tree 606 of 1000
building tree 607 of 1000
building tree 608 of 1000
building tree 609 of 1000
building tree 610 of 1000
building tree 611 of 1000
building tree 612 of 1000
building tree 613 of 1000
building tree 614 of 1000
building tree 615 of 1000
building tree 616 of 1000
building tree 617 of 1000
building tree 618 of 1000
building tree 619 of 1000
building tree 620 of 1000
building tree 621 of 1000
building tree 622 of 1000
building tree 623 of 1000
building tree 624 of 1000
building tree 625 of 1000
building tree 626 of 1000
building tree 627 of 1000
building tree 628 of 1000
building tree 629 of 1000
building tree 630 of 1000
building tree 631 of 1000
building tree 632 of 1000
building tree 633 of 1000
building tree 634 of 1000
building tree 635 of 1000
building tree 636 of 1000
building tree 637 of 1000
building tree 638 of 1000
[Parallel(n_jobs=-1)]: Done 624 tasks      | elapsed: 20.3min
building tree 639 of 1000
building tree 640 of 1000
building tree 641 of 1000
building tree 642 of 1000
building tree 643 of 1000
building tree 644 of 1000
building tree 645 of 1000
building tree 646 of 1000
building tree 647 of 1000
building tree 648 of 1000
building tree 649 of 1000
building tree 650 of 1000
building tree 651 of 1000
building tree 652 of 1000
building tree 653 of 1000
building tree 654 of 1000
building tree 655 of 1000
building tree 656 of 1000
building tree 657 of 1000
building tree 658 of 1000
building tree 659 of 1000
building tree 660 of 1000
building tree 661 of 1000
building tree 662 of 1000
building tree 663 of 1000
building tree 664 of 1000
building tree 665 of 1000
building tree 666 of 1000
building tree 667 of 1000
building tree 668 of 1000
building tree 669 of 1000
building tree 670 of 1000
building tree 671 of 1000
building tree 672 of 1000
[Parallel(n_jobs=-1)]: Done 661 tasks      | elapsed: 21.5min
building tree 673 of 1000
building tree 674 of 1000
building tree 675 of 1000
building tree 676 of 1000
building tree 677 of 1000
building tree 678 of 1000
building tree 679 of 1000
building tree 680 of 1000
building tree 681 of 1000
building tree 682 of 1000
building tree 683 of 1000
building tree 684 of 1000
building tree 685 of 1000
building tree 686 of 1000
building tree 687 of 1000
building tree 688 of 1000
building tree 689 of 1000
building tree 690 of 1000
building tree 691 of 1000
building tree 692 of 1000
building tree 693 of 1000
building tree 694 of 1000
building tree 695 of 1000
building tree 696 of 1000
building tree 697 of 1000
building tree 698 of 1000
building tree 699 of 1000
building tree 700 of 1000
building tree 701 of 1000
building tree 702 of 1000
building tree 703 of 1000
building tree 704 of 1000
building tree 705 of 1000
building tree 706 of 1000
building tree 707 of 1000
building tree 708 of 1000
building tree 709 of 1000
building tree 710 of 1000
[Parallel(n_jobs=-1)]: Done 698 tasks      | elapsed: 22.6min
building tree 711 of 1000
building tree 712 of 1000
building tree 713 of 1000
building tree 714 of 1000
building tree 715 of 1000
building tree 716 of 1000
building tree 717 of 1000
building tree 718 of 1000
building tree 719 of 1000
building tree 720 of 1000
building tree 721 of 1000
building tree 722 of 1000
building tree 723 of 1000
building tree 724 of 1000
building tree 725 of 1000
building tree 726 of 1000
building tree 727 of 1000
building tree 728 of 1000
building tree 729 of 1000
building tree 730 of 1000
building tree 731 of 1000
building tree 732 of 1000
building tree 733 of 1000
building tree 734 of 1000
building tree 735 of 1000
building tree 736 of 1000
building tree 737 of 1000
building tree 738 of 1000
building tree 739 of 1000
building tree 740 of 1000
building tree 741 of 1000
building tree 742 of 1000
building tree 743 of 1000
building tree 744 of 1000
building tree 745 of 1000
building tree 746 of 1000
building tree 747 of 1000
building tree 748 of 1000
building tree 749 of 1000
[Parallel(n_jobs=-1)]: Done 737 tasks      | elapsed: 23.9min
building tree 750 of 1000
building tree 751 of 1000
building tree 752 of 1000
building tree 753 of 1000
building tree 754 of 1000
building tree 755 of 1000
building tree 756 of 1000
building tree 757 of 1000
building tree 758 of 1000
building tree 759 of 1000
building tree 760 of 1000
building tree 761 of 1000
building tree 762 of 1000
building tree 763 of 1000
building tree 764 of 1000
building tree 765 of 1000
building tree 766 of 1000
building tree 767 of 1000
building tree 768 of 1000
building tree 769 of 1000
building tree 770 of 1000
building tree 771 of 1000
building tree 772 of 1000
building tree 773 of 1000
building tree 774 of 1000
building tree 775 of 1000
building tree 776 of 1000
building tree 777 of 1000
building tree 778 of 1000
building tree 779 of 1000
building tree 780 of 1000
building tree 781 of 1000
building tree 782 of 1000
building tree 783 of 1000
building tree 784 of 1000
building tree 785 of 1000
building tree 786 of 1000
building tree 787 of 1000
[Parallel(n_jobs=-1)]: Done 776 tasks      | elapsed: 25.1min
building tree 788 of 1000
building tree 789 of 1000
building tree 790 of 1000
building tree 791 of 1000
building tree 792 of 1000
building tree 793 of 1000
building tree 794 of 1000
building tree 795 of 1000
building tree 796 of 1000
building tree 797 of 1000
building tree 798 of 1000
building tree 799 of 1000
building tree 800 of 1000
building tree 801 of 1000
building tree 802 of 1000
building tree 803 of 1000
building tree 804 of 1000
building tree 805 of 1000
building tree 806 of 1000
building tree 807 of 1000
building tree 808 of 1000
building tree 809 of 1000
building tree 810 of 1000
building tree 811 of 1000
building tree 812 of 1000
building tree 813 of 1000
building tree 814 of 1000
building tree 815 of 1000
building tree 816 of 1000
building tree 817 of 1000
building tree 818 of 1000
building tree 819 of 1000
building tree 820 of 1000
building tree 821 of 1000
building tree 822 of 1000
building tree 823 of 1000
building tree 824 of 1000
building tree 825 of 1000
building tree 826 of 1000
building tree 827 of 1000
building tree 828 of 1000
[Parallel(n_jobs=-1)]: Done 817 tasks      | elapsed: 26.4min
building tree 829 of 1000
building tree 830 of 1000
building tree 831 of 1000
building tree 832 of 1000
building tree 833 of 1000
building tree 834 of 1000
building tree 835 of 1000
building tree 836 of 1000
building tree 837 of 1000
building tree 838 of 1000
building tree 839 of 1000
building tree 840 of 1000
building tree 841 of 1000
building tree 842 of 1000
building tree 843 of 1000
building tree 844 of 1000
building tree 845 of 1000
building tree 846 of 1000
building tree 847 of 1000
building tree 848 of 1000
building tree 849 of 1000
building tree 850 of 1000
building tree 851 of 1000
building tree 852 of 1000
building tree 853 of 1000
building tree 854 of 1000
building tree 855 of 1000
building tree 856 of 1000
building tree 857 of 1000
building tree 858 of 1000
building tree 859 of 1000
building tree 860 of 1000
building tree 861 of 1000
building tree 862 of 1000
building tree 863 of 1000
building tree 864 of 1000
building tree 865 of 1000
building tree 866 of 1000
building tree 867 of 1000
building tree 868 of 1000
building tree 869 of 1000
building tree 870 of 1000
building tree 871 of 1000
[Parallel(n_jobs=-1)]: Done 858 tasks      | elapsed: 27.7min
building tree 872 of 1000
building tree 873 of 1000
building tree 874 of 1000
building tree 875 of 1000
building tree 876 of 1000
building tree 877 of 1000
building tree 878 of 1000
building tree 879 of 1000
building tree 880 of 1000
building tree 881 of 1000
building tree 882 of 1000
building tree 883 of 1000
building tree 884 of 1000
building tree 885 of 1000
building tree 886 of 1000
building tree 887 of 1000
building tree 888 of 1000
building tree 889 of 1000
building tree 890 of 1000
building tree 891 of 1000
building tree 892 of 1000
building tree 893 of 1000
building tree 894 of 1000
building tree 895 of 1000
building tree 896 of 1000
building tree 897 of 1000
building tree 898 of 1000
building tree 899 of 1000
building tree 900 of 1000
building tree 901 of 1000
building tree 902 of 1000
building tree 903 of 1000
building tree 904 of 1000
building tree 905 of 1000
building tree 906 of 1000
building tree 907 of 1000
building tree 908 of 1000
building tree 909 of 1000
building tree 910 of 1000
building tree 911 of 1000
building tree 912 of 1000
[Parallel(n_jobs=-1)]: Done 901 tasks      | elapsed: 29.1min
building tree 913 of 1000
building tree 914 of 1000
building tree 915 of 1000
building tree 916 of 1000
building tree 917 of 1000
building tree 918 of 1000
building tree 919 of 1000
building tree 920 of 1000
building tree 921 of 1000
building tree 922 of 1000
building tree 923 of 1000
building tree 924 of 1000
building tree 925 of 1000
building tree 926 of 1000
building tree 927 of 1000
building tree 928 of 1000
building tree 929 of 1000
building tree 930 of 1000
building tree 931 of 1000
building tree 932 of 1000
building tree 933 of 1000
building tree 934 of 1000
building tree 935 of 1000
building tree 936 of 1000
building tree 937 of 1000
building tree 938 of 1000
building tree 939 of 1000
building tree 940 of 1000
building tree 941 of 1000
building tree 942 of 1000
building tree 943 of 1000
building tree 944 of 1000
building tree 945 of 1000
building tree 946 of 1000
building tree 947 of 1000
building tree 948 of 1000
building tree 949 of 1000
building tree 950 of 1000
building tree 951 of 1000
building tree 952 of 1000
building tree 953 of 1000
building tree 954 of 1000
building tree 955 of 1000
[Parallel(n_jobs=-1)]: Done 944 tasks      | elapsed: 30.5min
building tree 956 of 1000
building tree 957 of 1000
building tree 958 of 1000
building tree 959 of 1000
building tree 960 of 1000
building tree 961 of 1000
building tree 962 of 1000
building tree 963 of 1000
building tree 964 of 1000
building tree 965 of 1000
building tree 966 of 1000
building tree 967 of 1000
building tree 968 of 1000
building tree 969 of 1000
building tree 970 of 1000
building tree 971 of 1000
building tree 972 of 1000
building tree 973 of 1000
building tree 974 of 1000
building tree 975 of 1000
building tree 976 of 1000
building tree 977 of 1000
building tree 978 of 1000
building tree 979 of 1000
building tree 980 of 1000
building tree 981 of 1000
building tree 982 of 1000
building tree 983 of 1000
building tree 984 of 1000
building tree 985 of 1000
building tree 986 of 1000
building tree 987 of 1000
building tree 988 of 1000
building tree 989 of 1000
building tree 990 of 1000
building tree 991 of 1000
building tree 992 of 1000
building tree 993 of 1000
building tree 994 of 1000
building tree 995 of 1000
building tree 996 of 1000
building tree 997 of 1000
building tree 998 of 1000
building tree 999 of 1000
building tree 1000 of 1000
[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed: 32.5min finished
Out[20]:
RandomForestClassifier(bootstrap=False, class_weight=None, criterion='entropy',
                       max_depth=50, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=2, min_samples_split=10,
                       min_weight_fraction_leaf=0.0, n_estimators=1000,
                       n_jobs=-1, oob_score=False, random_state=None,
                       verbose=10, warm_start=False)
In [21]:
_jl.dump(best_model, "../models/best_Random_Forest_std.joblib")
Out[21]:
['../models/best_Random_Forest_std.joblib']
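Since the fitted forest is persisted with joblib, it can be restored in a later session without retraining. A minimal sketch of the round trip (function and variable names are illustrative, not part of the notebook above):

```python
# Persist an estimator with joblib, then read it straight back; joblib.dump
# above wrote the tuned forest to disk, and joblib.load restores it.
import joblib

def save_and_reload(model, path):
    """Dump an estimator to `path` and return the reloaded copy."""
    joblib.dump(model, path)
    return joblib.load(path)
```

In a fresh notebook the saved model would be recovered with something like `best_model = _jl.load("../models/best_Random_Forest_std.joblib")`.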
In [22]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_real])
test_set.shape
Out[22]:
(153993, 753)
In [23]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
[Parallel(n_jobs=12)]: Using backend ThreadingBackend with 12 concurrent workers.
[Parallel(n_jobs=12)]: Done   1 tasks      | elapsed:    0.1s
...
[Parallel(n_jobs=12)]: Done 944 tasks      | elapsed:   22.2s
predictions:
 [0 0 1 ... 1 1 0]
[Parallel(n_jobs=12)]: Done 1000 out of 1000 | elapsed:   23.5s finished
In [24]:
set(predic)
Out[24]:
{0, 1}
In [25]:
# evaluate classifier

print("Report for Random Forest classifier:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Random Forest Classifier:", _accuracy_score(test_set['likes'], predic)*100)
Report for Random Forest classifier:
              precision    recall  f1-score   support

           0       0.69      0.40      0.51     50930
           1       0.75      0.91      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.72      0.66      0.67    153993
weighted avg       0.73      0.74      0.72    153993

Accuracy for Random Forest Classifier: 74.30337742624666
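The test set is imbalanced (103063 positives vs. 50930 negatives in the support column above), so it is worth comparing the forest against the trivial majority-class baseline. A quick arithmetic sketch using those support counts:

```python
# Always predicting the majority class "likes" = 1 would already score
# ~66.9% accuracy, so the forest's 74.3% is a real, if modest, improvement.
majority = 103063  # test reviews with likes = 1
minority = 50930   # test reviews with likes = 0
baseline = majority / (majority + minority)
print(round(baseline * 100, 1))  # → 66.9
```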
In [26]:
# Confusion matrix for Random Forest

print("Confusion Matrix for Random Forest: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for Random Forest: 
Out[26]:
array([[20322, 30608],
       [ 8963, 94100]], dtype=int64)
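Normalising the confusion matrix above by row makes the imbalance easier to read, since each row then sums to 1 and the diagonal reproduces the per-class recall from the report (0.40 and 0.91). A small sketch:

```python
# Row-normalise the Random Forest confusion matrix reported above.
import numpy as np

cm = np.array([[20322, 30608],
               [ 8963, 94100]])
cm_norm = cm / cm.sum(axis=1, keepdims=True)  # rows: true class 0, true class 1
print(cm_norm.round(2))
```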
In [27]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("Random Forest ROC curve")
_plt.xlabel("False Positive")
_plt.ylabel("True Positive")

_plt.grid(True)
_plt.show()
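Note that the curve above is built from hard 0/1 predictions, which collapses the ROC to a single operating point joined to the corners. Feeding the positive-class probability to `roc_curve` would trace the full curve and support an AUC score. A sketch (function and variable names are illustrative):

```python
# ROC from class probabilities rather than hard labels.
from sklearn.metrics import roc_curve, roc_auc_score

def roc_from_probabilities(model, X, y):
    """Return (fpr, tpr, auc) from the model's positive-class probabilities."""
    scores = model.predict_proba(X)[:, 1]
    fpr, tpr, _ = roc_curve(y, scores)
    return fpr, tpr, roc_auc_score(y, scores)
```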
In [28]:
_del_all()

7.9 Neural Network with only standard columns

In [29]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[29]:
(558386, 787)
In [30]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_real])
train_set.shape
Out[30]:
(558386, 753)
In [31]:
# Number of neurons per hidden layer
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
107
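The rule-of-thumb arithmetic can be checked by hand (a sketch using the counts above: 558386 training samples and 753 - 5 = 748 input features after dropping the label and id columns):

```python
# Verify the hidden-layer sizing heuristic n = samples / (alpha * (in + out)).
samples, n_inputs, n_outputs, alpha = 558386, 748, 1, 7
hidden = round(samples / (alpha * (n_inputs + n_outputs)))
print(hidden)  # → 107
```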
In [32]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
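As a sanity check on the architecture, the trainable parameter count can be worked out by hand (pure arithmetic, assuming 748 input features and the 107-unit layers above: each Dense layer holds in_dim * units weights plus units biases):

```python
# Parameter count: input layer + four 107->107 hidden layers + sigmoid output.
n_in, h = 748, 107
total_params = (n_in * h + h) + 4 * (h * h + h) + (h * 1 + 1)
print(total_params)  # → 126475
```

The same figure should be reported by `classifier.summary()`.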
In [33]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 34s 86us/step - loss: 0.5951 - acc: 0.6957 - val_loss: 0.5575 - val_acc: 0.7210
Epoch 2/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5558 - acc: 0.7236 - val_loss: 0.5577 - val_acc: 0.7252
...
Epoch 99/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5069 - acc: 0.7559 - val_loss: 0.5239 - val_acc: 0.7496
Epoch 100/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5140 - acc: 0.7515 - val_loss: 0.5198 - val_acc: 0.7499
Out[33]:
<keras.callbacks.History at 0x1b708c73eb8>
In [34]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_std.h5")
In [35]:
# evaluate the trained model on the training set
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 17s 31us/step
[0.5116986021793278, 0.7529630040865652]
In [36]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_real])
test_set.shape
Out[36]:
(153993, 753)
In [37]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[37]:
array([[1.],
       [0.],
       [0.],
       ...,
       [0.],
       [1.],
       [0.]], dtype=float32)
In [38]:
set(binary_prediction[:,0])
Out[38]:
{0.0, 1.0}
In [39]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.72      0.36      0.48     50930
           1       0.75      0.93      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.64      0.65    153993
weighted avg       0.74      0.74      0.71    153993

Accuracy for Deep Learning approach: 74.13713610358913
In [40]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[18245 32685]
 [ 7142 95921]]
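The class-wise figures in the classification report can be cross-checked against this confusion matrix. A quick sanity check, using the `[[TN, FP], [FN, TP]]` counts printed above:

```python
# Cross-check the report against the confusion matrix printed above
# for the neural network: [[TN, FP], [FN, TP]].
tn, fp, fn, tp = 18245, 32685, 7142, 95921

precision_1 = tp / (tp + fp)                 # class-1 precision
recall_1 = tp / (tp + fn)                    # class-1 recall
accuracy = (tn + tp) / (tn + fp + fn + tp)   # overall accuracy

print(round(precision_1, 2), round(recall_1, 2), round(accuracy, 4))
# → 0.75 0.93 0.7414, matching the report
```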
In [41]:
# draw ROC curve: use the continuous network outputs rather than the
# binarized labels, so the curve has more than one intermediate point
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
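The ROC curve can be summarized with a single number, the area under it (AUC). A minimal, self-contained sketch with `sklearn.metrics.roc_auc_score` on toy data (the labels and scores below are illustrative, not taken from the test set):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy labels and continuous scores (illustrative values only).
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])

# AUC = fraction of (positive, negative) pairs ranked correctly.
auc = roc_auc_score(y_true, y_score)
print(auc)  # → 0.75: one of the four pairs is mis-ranked
```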
In [42]:
_del_all()

7.10 SVM with real-valued columns only

In [43]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[43]:
(558386, 787)
In [44]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_std])
train_set.shape
Out[44]:
(558386, 754)
In [45]:
best_model = _jl.load("../models/best_SVM.joblib")
best_model.set_params(verbose=10)
best_model.get_params()
Out[45]:
{'C': 0.001,
 'class_weight': None,
 'dual': True,
 'fit_intercept': True,
 'intercept_scaling': 1,
 'loss': 'squared_hinge',
 'max_iter': 50000,
 'multi_class': 'ovr',
 'penalty': 'l2',
 'random_state': 0,
 'tol': 0.0001,
 'verbose': 10}
In [46]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[LibLinear]
C:\Users\super\Anaconda3\envs\tf-gpu\lib\site-packages\sklearn\svm\base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Out[46]:
LinearSVC(C=0.001, class_weight=None, dual=True, fit_intercept=True,
          intercept_scaling=1, loss='squared_hinge', max_iter=50000,
          multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
          verbose=10)
In [47]:
_jl.dump(best_model, "../models/best_SVM_real.joblib")
Out[47]:
['../models/best_SVM_real.joblib']
In [48]:
print("coef:", best_model.coef_)
print("intercept:", best_model.intercept_)
coef: [[-1.43716297e-01 -2.04535344e-01  3.42412756e-01 -1.70106570e-01
    9.42906076e-02 -7.64244164e-02  2.83228461e-05 -3.44367883e-02
    ... (749-dimensional weight vector; middle entries omitted for brevity) ...
   -6.98591714e-03]]
intercept: [-0.34531121]
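For a linear model like this, the magnitude of each coefficient indicates how strongly the corresponding feature drives the decision (assuming comparably scaled features). A minimal sketch of ranking features that way, on synthetic data (all names and values below are illustrative):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.RandomState(0)
X = rng.randn(200, 5)
# Synthetic label depending mostly on features 0 and 3.
y = (2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.randn(200) > 0).astype(int)

model = LinearSVC(random_state=0, max_iter=50000).fit(X, y)

# Rank feature indices by absolute coefficient, largest first.
ranking = np.argsort(-np.abs(model.coef_[0]))
print(ranking)
```

On this data the two informative features (0 and 3) come out on top; with the notebook's 749 features the same idea would surface the columns the SVM relies on most.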
In [49]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_std])
test_set.shape
Out[49]:
(153993, 754)
In [50]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
predictions:
 [0 0 0 ... 0 0 0]
In [51]:
set(predic)
Out[51]:
{0, 1}
In [52]:
# evaluate classifier

print("Report for Support Vector Machine:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Support Vector Machine:", _accuracy_score(test_set['likes'], predic)*100)
Report for Support Vector Machine:
              precision    recall  f1-score   support

           0       0.55      0.49      0.52     50930
           1       0.76      0.81      0.78    103063

    accuracy                           0.70    153993
   macro avg       0.66      0.65      0.65    153993
weighted avg       0.69      0.70      0.69    153993

Accuracy for Support Vector Machine: 70.01292266531595
In [53]:
# Confusion matrix for SVC

print("Confusion Matrix for SVC: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for SVC: 
Out[53]:
array([[24734, 26196],
       [19982, 83081]], dtype=int64)
In [54]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("SVM ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
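Note that the curve above is built from hard 0/1 predictions, so it has only one intermediate point. `LinearSVC` exposes a continuous score, the signed distance to the separating hyperplane, via `decision_function`, which yields a proper curve. A minimal sketch on synthetic data (not the notebook's test set):

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_curve

rng = np.random.RandomState(0)
X = rng.randn(300, 4)
y = (X[:, 0] + 0.5 * rng.randn(300) > 0).astype(int)

model = LinearSVC(random_state=0, max_iter=50000).fit(X, y)

# Score with the margin, not predict(): many thresholds, many ROC points.
scores = model.decision_function(X)
fpr, tpr, thresholds = roc_curve(y, scores)
print(len(fpr))
```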
In [55]:
_del_all()

7.11 Random forest with real-valued columns only

In [56]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[56]:
(558386, 787)
In [57]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_std])
train_set.shape
Out[57]:
(558386, 754)
In [58]:
params = _jl.load("../models/best_Random_Forest_2.joblib").get_params()
params['n_jobs'] = -1
params['verbose'] = 10
best_model = _RandomForestClassifier(**params)
best_model.get_params()
Out[58]:
{'bootstrap': False,
 'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 50,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 2,
 'min_samples_split': 10,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 1000,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': None,
 'verbose': 10,
 'warm_start': False}
In [59]:
best_model.fit(train_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 12 concurrent workers.
building tree 1 of 1000
... (per-tree progress lines for trees 2–458 omitted; the parallel workers interleave their output) ...
[Parallel(n_jobs=-1)]: Done   1 tasks      | elapsed:   21.4s
[Parallel(n_jobs=-1)]: Done   8 tasks      | elapsed:   24.0s
[Parallel(n_jobs=-1)]: Done  17 tasks      | elapsed:   46.1s
[Parallel(n_jobs=-1)]: Done  26 tasks      | elapsed:  1.1min
[Parallel(n_jobs=-1)]: Done  37 tasks      | elapsed:  1.5min
[Parallel(n_jobs=-1)]: Done  48 tasks      | elapsed:  1.7min
[Parallel(n_jobs=-1)]: Done  61 tasks      | elapsed:  2.2min
[Parallel(n_jobs=-1)]: Done  74 tasks      | elapsed:  2.7min
[Parallel(n_jobs=-1)]: Done  89 tasks      | elapsed:  3.2min
[Parallel(n_jobs=-1)]: Done 104 tasks      | elapsed:  3.6min
[Parallel(n_jobs=-1)]: Done 121 tasks      | elapsed:  4.2min
[Parallel(n_jobs=-1)]: Done 138 tasks      | elapsed:  4.7min
[Parallel(n_jobs=-1)]: Done 157 tasks      | elapsed:  5.4min
[Parallel(n_jobs=-1)]: Done 176 tasks      | elapsed:  6.0min
[Parallel(n_jobs=-1)]: Done 197 tasks      | elapsed:  6.7min
[Parallel(n_jobs=-1)]: Done 218 tasks      | elapsed:  7.6min
[Parallel(n_jobs=-1)]: Done 241 tasks      | elapsed:  8.3min
[Parallel(n_jobs=-1)]: Done 264 tasks      | elapsed:  9.1min
[Parallel(n_jobs=-1)]: Done 289 tasks      | elapsed:  9.9min
[Parallel(n_jobs=-1)]: Done 314 tasks      | elapsed: 10.8min
[Parallel(n_jobs=-1)]: Done 341 tasks      | elapsed: 11.6min
[Parallel(n_jobs=-1)]: Done 368 tasks      | elapsed: 12.5min
[Parallel(n_jobs=-1)]: Done 397 tasks      | elapsed: 13.5min
[Parallel(n_jobs=-1)]: Done 426 tasks      | elapsed: 14.4min
building tree 459 of 1000
building tree 460 of 1000
building tree 461 of 1000
building tree 462 of 1000
building tree 463 of 1000
building tree 464 of 1000
building tree 465 of 1000
building tree 466 of 1000
building tree 467 of 1000
building tree 468 of 1000
[Parallel(n_jobs=-1)]: Done 457 tasks      | elapsed: 15.5min
building tree 469 of 1000
building tree 470 of 1000
building tree 471 of 1000
building tree 472 of 1000
building tree 473 of 1000
building tree 474 of 1000
building tree 475 of 1000
building tree 476 of 1000
building tree 477 of 1000
building tree 478 of 1000
building tree 479 of 1000
building tree 480 of 1000
building tree 481 of 1000
building tree 482 of 1000
building tree 483 of 1000
building tree 484 of 1000
building tree 485 of 1000
building tree 486 of 1000
building tree 487 of 1000
building tree 488 of 1000
building tree 489 of 1000
building tree 490 of 1000
building tree 491 of 1000
building tree 492 of 1000
building tree 493 of 1000
building tree 494 of 1000
building tree 495 of 1000
building tree 496 of 1000
building tree 497 of 1000
building tree 498 of 1000
building tree 499 of 1000
building tree 500 of 1000
[Parallel(n_jobs=-1)]: Done 488 tasks      | elapsed: 16.5min
building tree 501 of 1000
building tree 502 of 1000
building tree 503 of 1000
building tree 504 of 1000
building tree 505 of 1000
building tree 506 of 1000
building tree 507 of 1000
building tree 508 of 1000
building tree 509 of 1000
building tree 510 of 1000
building tree 511 of 1000
building tree 512 of 1000
building tree 513 of 1000
building tree 514 of 1000
building tree 515 of 1000
building tree 516 of 1000
building tree 517 of 1000
building tree 518 of 1000
building tree 519 of 1000
building tree 520 of 1000
building tree 521 of 1000
building tree 522 of 1000
building tree 523 of 1000
building tree 524 of 1000
building tree 525 of 1000
building tree 526 of 1000
building tree 527 of 1000
building tree 528 of 1000
building tree 529 of 1000
building tree 530 of 1000
building tree 531 of 1000
building tree 532 of 1000
[Parallel(n_jobs=-1)]: Done 521 tasks      | elapsed: 17.7min
building tree 533 of 1000
building tree 534 of 1000
building tree 535 of 1000
building tree 536 of 1000
building tree 537 of 1000
building tree 538 of 1000
building tree 539 of 1000
building tree 540 of 1000
building tree 541 of 1000
building tree 542 of 1000
building tree 543 of 1000
building tree 544 of 1000
building tree 545 of 1000
building tree 546 of 1000
building tree 547 of 1000
building tree 548 of 1000
building tree 549 of 1000
building tree 550 of 1000
building tree 551 of 1000
building tree 552 of 1000
building tree 553 of 1000
building tree 554 of 1000
building tree 555 of 1000
building tree 556 of 1000
building tree 557 of 1000
building tree 558 of 1000
building tree 559 of 1000
building tree 560 of 1000
building tree 561 of 1000
building tree 562 of 1000
building tree 563 of 1000
building tree 564 of 1000
building tree 565 of 1000
[Parallel(n_jobs=-1)]: Done 554 tasks      | elapsed: 18.7min
building tree 566 of 1000
building tree 567 of 1000
building tree 568 of 1000
building tree 569 of 1000
building tree 570 of 1000
building tree 571 of 1000
building tree 572 of 1000
building tree 573 of 1000
building tree 574 of 1000
building tree 575 of 1000
building tree 576 of 1000
building tree 577 of 1000
building tree 578 of 1000
building tree 579 of 1000
building tree 580 of 1000
building tree 581 of 1000
building tree 582 of 1000
building tree 583 of 1000
building tree 584 of 1000
building tree 585 of 1000
building tree 586 of 1000
building tree 587 of 1000
building tree 588 of 1000
building tree 589 of 1000
building tree 590 of 1000
building tree 591 of 1000
building tree 592 of 1000
building tree 593 of 1000
building tree 594 of 1000
building tree 595 of 1000
building tree 596 of 1000
building tree 597 of 1000
building tree 598 of 1000
building tree 599 of 1000
building tree 600 of 1000
building tree 601 of 1000
[Parallel(n_jobs=-1)]: Done 589 tasks      | elapsed: 19.9min
building tree 602 of 1000
building tree 603 of 1000
building tree 604 of 1000
building tree 605 of 1000
building tree 606 of 1000
building tree 607 of 1000
building tree 608 of 1000
building tree 609 of 1000
building tree 610 of 1000
building tree 611 of 1000
building tree 612 of 1000
building tree 613 of 1000
building tree 614 of 1000
building tree 615 of 1000
building tree 616 of 1000
building tree 617 of 1000
building tree 618 of 1000
building tree 619 of 1000
building tree 620 of 1000
building tree 621 of 1000
building tree 622 of 1000
building tree 623 of 1000
building tree 624 of 1000
building tree 625 of 1000
building tree 626 of 1000
building tree 627 of 1000
building tree 628 of 1000
building tree 629 of 1000
building tree 630 of 1000
building tree 631 of 1000
building tree 632 of 1000
building tree 633 of 1000
building tree 634 of 1000
building tree 635 of 1000
[Parallel(n_jobs=-1)]: Done 624 tasks      | elapsed: 21.0min
building tree 636 of 1000
building tree 637 of 1000
building tree 638 of 1000
building tree 639 of 1000
building tree 640 of 1000
building tree 641 of 1000
building tree 642 of 1000
building tree 643 of 1000
building tree 644 of 1000
building tree 645 of 1000
building tree 646 of 1000
building tree 647 of 1000
building tree 648 of 1000
building tree 649 of 1000
building tree 650 of 1000
building tree 651 of 1000
building tree 652 of 1000
building tree 653 of 1000
building tree 654 of 1000
building tree 655 of 1000
building tree 656 of 1000
building tree 657 of 1000
building tree 658 of 1000
building tree 659 of 1000
building tree 660 of 1000
building tree 661 of 1000
building tree 662 of 1000
building tree 663 of 1000
building tree 664 of 1000
building tree 665 of 1000
building tree 666 of 1000
building tree 667 of 1000
building tree 668 of 1000
building tree 669 of 1000
building tree 670 of 1000
building tree 671 of 1000
building tree 672 of 1000
building tree 673 of 1000
[Parallel(n_jobs=-1)]: Done 661 tasks      | elapsed: 22.1min
building tree 674 of 1000
building tree 675 of 1000
building tree 676 of 1000
building tree 677 of 1000
building tree 678 of 1000
building tree 679 of 1000
building tree 680 of 1000
building tree 681 of 1000
building tree 682 of 1000
building tree 683 of 1000
building tree 684 of 1000
building tree 685 of 1000
building tree 686 of 1000
building tree 687 of 1000
building tree 688 of 1000
building tree 689 of 1000
building tree 690 of 1000
building tree 691 of 1000
building tree 692 of 1000
building tree 693 of 1000
building tree 694 of 1000
building tree 695 of 1000
building tree 696 of 1000
building tree 697 of 1000
building tree 698 of 1000
building tree 699 of 1000
building tree 700 of 1000
building tree 701 of 1000
building tree 702 of 1000
building tree 703 of 1000
building tree 704 of 1000
building tree 705 of 1000
building tree 706 of 1000
building tree 707 of 1000
building tree 708 of 1000
building tree 709 of 1000
[Parallel(n_jobs=-1)]: Done 698 tasks      | elapsed: 23.4min
building tree 710 of 1000
building tree 711 of 1000
building tree 712 of 1000
building tree 713 of 1000
building tree 714 of 1000
building tree 715 of 1000
building tree 716 of 1000
building tree 717 of 1000
building tree 718 of 1000
building tree 719 of 1000
building tree 720 of 1000
building tree 721 of 1000
building tree 722 of 1000
building tree 723 of 1000
building tree 724 of 1000
building tree 725 of 1000
building tree 726 of 1000
building tree 727 of 1000
building tree 728 of 1000
building tree 729 of 1000
building tree 730 of 1000
building tree 731 of 1000
building tree 732 of 1000
building tree 733 of 1000
building tree 734 of 1000
building tree 735 of 1000
building tree 736 of 1000
building tree 737 of 1000
building tree 738 of 1000
building tree 739 of 1000
building tree 740 of 1000
building tree 741 of 1000
building tree 742 of 1000
building tree 743 of 1000
building tree 744 of 1000
building tree 745 of 1000
building tree 746 of 1000
building tree 747 of 1000
building tree 748 of 1000
[Parallel(n_jobs=-1)]: Done 737 tasks      | elapsed: 24.7min
building tree 749 of 1000
building tree 750 of 1000
building tree 751 of 1000
building tree 752 of 1000
building tree 753 of 1000
building tree 754 of 1000
building tree 755 of 1000
building tree 756 of 1000
building tree 757 of 1000
building tree 758 of 1000
building tree 759 of 1000
building tree 760 of 1000
building tree 761 of 1000
building tree 762 of 1000
building tree 763 of 1000
building tree 764 of 1000
building tree 765 of 1000
building tree 766 of 1000
building tree 767 of 1000
building tree 768 of 1000
building tree 769 of 1000
building tree 770 of 1000
building tree 771 of 1000
building tree 772 of 1000
building tree 773 of 1000
building tree 774 of 1000
building tree 775 of 1000
building tree 776 of 1000
building tree 777 of 1000
building tree 778 of 1000
building tree 779 of 1000
building tree 780 of 1000
building tree 781 of 1000
building tree 782 of 1000
building tree 783 of 1000
building tree 784 of 1000
building tree 785 of 1000
building tree 786 of 1000
building tree 787 of 1000
building tree 788 of 1000
[Parallel(n_jobs=-1)]: Done 776 tasks      | elapsed: 26.0min
building tree 789 of 1000
building tree 790 of 1000
building tree 791 of 1000
building tree 792 of 1000
building tree 793 of 1000
building tree 794 of 1000
building tree 795 of 1000
building tree 796 of 1000
building tree 797 of 1000
building tree 798 of 1000
building tree 799 of 1000
building tree 800 of 1000
building tree 801 of 1000
building tree 802 of 1000
building tree 803 of 1000
building tree 804 of 1000
building tree 805 of 1000
building tree 806 of 1000
building tree 807 of 1000
building tree 808 of 1000
building tree 809 of 1000
building tree 810 of 1000
building tree 811 of 1000
building tree 812 of 1000
building tree 813 of 1000
building tree 814 of 1000
building tree 815 of 1000
building tree 816 of 1000
building tree 817 of 1000
building tree 818 of 1000
building tree 819 of 1000
building tree 820 of 1000
building tree 821 of 1000
building tree 822 of 1000
building tree 823 of 1000
building tree 824 of 1000
building tree 825 of 1000
building tree 826 of 1000
building tree 827 of 1000
building tree 828 of 1000
[Parallel(n_jobs=-1)]: Done 817 tasks      | elapsed: 27.4min
building tree 829 of 1000
building tree 830 of 1000
building tree 831 of 1000
building tree 832 of 1000
building tree 833 of 1000
building tree 834 of 1000
building tree 835 of 1000
building tree 836 of 1000
building tree 837 of 1000
building tree 838 of 1000
building tree 839 of 1000
building tree 840 of 1000
building tree 841 of 1000
building tree 842 of 1000
building tree 843 of 1000
building tree 844 of 1000
building tree 845 of 1000
building tree 846 of 1000
building tree 847 of 1000
building tree 848 of 1000
building tree 849 of 1000
building tree 850 of 1000
building tree 851 of 1000
building tree 852 of 1000
building tree 853 of 1000
building tree 854 of 1000
building tree 855 of 1000
building tree 856 of 1000
building tree 857 of 1000
building tree 858 of 1000building tree 859 of 1000

building tree 860 of 1000
building tree 861 of 1000
building tree 862 of 1000
building tree 863 of 1000
building tree 864 of 1000
building tree 865 of 1000
building tree 866 of 1000
building tree 867 of 1000
building tree 868 of 1000
building tree 869 of 1000
[Parallel(n_jobs=-1)]: Done 858 tasks      | elapsed: 28.7min
building tree 870 of 1000
building tree 871 of 1000
building tree 872 of 1000
building tree 873 of 1000
building tree 874 of 1000
building tree 875 of 1000
building tree 876 of 1000
building tree 877 of 1000
building tree 878 of 1000
building tree 879 of 1000
building tree 880 of 1000
building tree 881 of 1000
building tree 882 of 1000
building tree 883 of 1000
building tree 884 of 1000
building tree 885 of 1000
building tree 886 of 1000
building tree 887 of 1000
building tree 888 of 1000
building tree 889 of 1000
building tree 890 of 1000
building tree 891 of 1000
building tree 892 of 1000
building tree 893 of 1000
building tree 894 of 1000building tree 895 of 1000

building tree 896 of 1000
building tree 897 of 1000
building tree 898 of 1000
building tree 899 of 1000
building tree 900 of 1000
building tree 901 of 1000
building tree 902 of 1000
building tree 903 of 1000
building tree 904 of 1000
building tree 905 of 1000
building tree 906 of 1000
building tree 907 of 1000
building tree 908 of 1000
building tree 909 of 1000
building tree 910 of 1000
building tree 911 of 1000
building tree 912 of 1000
[Parallel(n_jobs=-1)]: Done 901 tasks      | elapsed: 30.2min
building tree 913 of 1000
building tree 914 of 1000
building tree 915 of 1000
building tree 916 of 1000
building tree 917 of 1000
building tree 918 of 1000
building tree 919 of 1000
building tree 920 of 1000
building tree 921 of 1000
building tree 922 of 1000
building tree 923 of 1000
building tree 924 of 1000
building tree 925 of 1000
building tree 926 of 1000
building tree 927 of 1000
building tree 928 of 1000
building tree 929 of 1000
building tree 930 of 1000
building tree 931 of 1000
building tree 932 of 1000
building tree 933 of 1000
building tree 934 of 1000
building tree 935 of 1000
building tree 936 of 1000
building tree 937 of 1000
building tree 938 of 1000
building tree 939 of 1000
building tree 940 of 1000
building tree 941 of 1000
building tree 942 of 1000
building tree 943 of 1000
building tree 944 of 1000
building tree 945 of 1000
building tree 946 of 1000
building tree 947 of 1000
building tree 948 of 1000
building tree 949 of 1000
building tree 950 of 1000
building tree 951 of 1000
building tree 952 of 1000
building tree 953 of 1000
building tree 954 of 1000
building tree 955 of 1000
[Parallel(n_jobs=-1)]: Done 944 tasks      | elapsed: 31.6min
building tree 956 of 1000
building tree 957 of 1000
building tree 958 of 1000
building tree 959 of 1000
building tree 960 of 1000
building tree 961 of 1000
building tree 962 of 1000
building tree 963 of 1000
building tree 964 of 1000
building tree 965 of 1000
building tree 966 of 1000
building tree 967 of 1000
building tree 968 of 1000
building tree 969 of 1000
building tree 970 of 1000
building tree 971 of 1000
building tree 972 of 1000
building tree 973 of 1000
building tree 974 of 1000
building tree 975 of 1000
building tree 976 of 1000
building tree 977 of 1000
building tree 978 of 1000
building tree 979 of 1000
building tree 980 of 1000
building tree 981 of 1000
building tree 982 of 1000
building tree 983 of 1000
building tree 984 of 1000
building tree 985 of 1000
building tree 986 of 1000
building tree 987 of 1000
building tree 988 of 1000
building tree 989 of 1000
building tree 990 of 1000
building tree 991 of 1000
building tree 992 of 1000
building tree 993 of 1000
building tree 994 of 1000
building tree 995 of 1000
building tree 996 of 1000
building tree 997 of 1000
building tree 998 of 1000
building tree 999 of 1000
building tree 1000 of 1000
[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed: 33.3min finished
Out[59]:
RandomForestClassifier(bootstrap=False, class_weight=None, criterion='entropy',
                       max_depth=50, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=2, min_samples_split=10,
                       min_weight_fraction_leaf=0.0, n_estimators=1000,
                       n_jobs=-1, oob_score=False, random_state=None,
                       verbose=10, warm_start=False)
In [60]:
_jl.dump(best_model, "../models/best_Random_Forest_real.joblib")
Out[60]:
['../models/best_Random_Forest_real.joblib']
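For later reuse, a model persisted this way can be restored with `joblib.load`. A minimal, self-contained sketch of the round trip (using a tiny stand-in model and a temporary path, since the notebook's `../models/` directory and `best_model` are assumed):

```python
import os
import tempfile

import joblib
from sklearn.ensemble import RandomForestClassifier

# Train a tiny stand-in model (in the notebook, best_model would be used instead).
model = RandomForestClassifier(n_estimators=5, random_state=0)
model.fit([[0, 0], [1, 1], [0, 1], [1, 0]], [0, 1, 0, 1])

# Round-trip through joblib: dump to disk, load it back, verify predictions match.
path = os.path.join(tempfile.mkdtemp(), "model.joblib")
joblib.dump(model, path)
restored = joblib.load(path)
assert (restored.predict([[0, 0], [1, 1]]) == model.predict([[0, 0], [1, 1]])).all()
```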
In [61]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_std])
test_set.shape
Out[61]:
(153993, 754)
In [62]:
# test classifier
predic = best_model.predict(test_set.drop(columns=['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
print("predictions:\n", predic)
[Parallel(n_jobs=12)]: Using backend ThreadingBackend with 12 concurrent workers.
[Parallel(n_jobs=12)]: Done 1000 out of 1000 | elapsed:   24.3s finished
predictions:
 [0 0 1 ... 0 1 0]
In [63]:
set(predic)
Out[63]:
{0, 1}
In [64]:
# evaluate classifier

print("Report for Random Forest classifier:")
print(_classification_report(test_set['likes'], predic))

print("Accuracy for Random Forest Classifier:", _accuracy_score(test_set['likes'], predic)*100)
Report for Random Forest classifier:
              precision    recall  f1-score   support

           0       0.70      0.39      0.50     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.65      0.67    153993
weighted avg       0.74      0.74      0.72    153993

Accuracy for Random Forest Classifier: 74.34039209574462
In [65]:
# Confusion matrix for Random Forest

print("Confusion Matrix for Random Forest: ")
_confusion_matrix(test_set['likes'], predic)
Confusion Matrix for Random Forest: 
Out[65]:
array([[20055, 30875],
       [ 8639, 94424]], dtype=int64)
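Row-normalizing the matrix above gives the per-class recall, which matches the classification report (0.39 for class 0, 0.92 for class 1). A small sketch using the printed counts:

```python
import numpy as np

# Confusion matrix printed above: rows = true class, columns = predicted class.
cm = np.array([[20055, 30875],
               [ 8639, 94424]])

# Row-normalize to get per-class recall (correct predictions / actual class size).
recall = cm.diagonal() / cm.sum(axis=1)
print(recall.round(2))  # → [0.39 0.92]
```

The low recall on class 0 shows the model leans heavily towards predicting "likes", consistent with the 2:1 class imbalance in the test set.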
In [66]:
# draw ROC curve
fpr, tpr, thresholds = _roc_curve(test_set['likes'], predic)

_plt.plot(fpr,tpr)
_plt.xlim([0.0,1.0])
_plt.ylim([0.0,1.0])

_plt.title("Random Forest ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
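The curve above is drawn from hard 0/1 predictions, so it has a single intermediate operating point; the area under it can still be computed with sklearn's `auc` (scores from `predict_proba` would give a smoother curve). A minimal sketch on toy labels (stand-ins for `test_set['likes']` and `predic`):

```python
from sklearn.metrics import roc_curve, auc

# Toy labels and hard binary predictions.
y_true = [0, 0, 1, 1]
y_pred = [0, 1, 1, 1]

# With hard predictions the ROC curve has one intermediate point (FPR, TPR).
fpr, tpr, _ = roc_curve(y_true, y_pred)
print(auc(fpr, tpr))  # → 0.75
```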
In [67]:
_del_all()

7.12 Neural Network with only real columns

In [68]:
train_set = _pd.read_pickle('../dataset/m2_n9/model_train_set_3.pickle')
train_set.shape
Out[68]:
(558386, 787)
In [69]:
train_set = train_set.drop(columns=[*_cols_bin, *_cols_std])
train_set.shape
Out[69]:
(558386, 754)
In [70]:
# Number of neurons for the hidden layers
number_samples, number_features = train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']).shape

number_input_neurons = number_features
number_output_neurons = 1
number_train_samples = number_samples
alpha = 7

number_hidden_neurons = round(number_train_samples / (alpha * (number_input_neurons + number_output_neurons)))
print(number_hidden_neurons)
106
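The cell above applies a common rule of thumb, N_h ≈ N_s / (α · (N_i + N_o)), with α = 7. Plugging in the shapes from this notebook (558,386 training rows; 749 input features after dropping the five id/label columns) reproduces the printed value:

```python
# Rule-of-thumb check with the shapes reported earlier in this notebook.
n_samples = 558386   # rows of train_set
n_inputs = 754 - 5   # columns minus likes, stars_review, review_id, user_id, business_id
n_outputs = 1
alpha = 7

hidden = round(n_samples / (alpha * (n_inputs + n_outputs)))
print(hidden)  # → 106
```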
In [71]:
classifier = _Sequential()

# First Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal', input_dim = number_features))

# Second Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Third Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fourth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Fifth Hidden Layer
classifier.add(_Dense(number_hidden_neurons, activation = 'relu', kernel_initializer = 'random_normal'))

# Output Layer
classifier.add(_Dense(1, activation = 'sigmoid', kernel_initializer = 'random_normal'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
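The validation loss logged during training fluctuates from epoch to epoch, so running a fixed 100 epochs risks overfitting; Keras provides an `EarlyStopping` callback for this. Its core patience logic can be sketched in plain Python (the `val_losses` values below are hypothetical):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 1-based epoch at which patience-based early stopping halts,
    or len(val_losses) if it never triggers. Mirrors the core behaviour of
    Keras's EarlyStopping(monitor='val_loss', patience=...)."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0      # improvement: reset the counter
        else:
            wait += 1                 # no improvement this epoch
            if wait >= patience:
                return epoch          # stop training here
    return len(val_losses)

# Hypothetical validation losses: improves twice, then stalls for 3 epochs.
print(early_stop_epoch([0.56, 0.54, 0.55, 0.55, 0.56], patience=3))  # → 5
```

In the actual fit call, this would be `callbacks=[EarlyStopping(monitor='val_loss', patience=3)]` from `keras.callbacks`.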
In [72]:
# Fitting the data to the training dataset
classifier.fit(train_set.drop(
    columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']),
    train_set['likes'], validation_split = 0.3, batch_size = 100, epochs = 100)
Train on 390870 samples, validate on 167516 samples
Epoch 1/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5946 - acc: 0.6956 - val_loss: 0.5614 - val_acc: 0.7157
Epoch 2/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5553 - acc: 0.7237 - val_loss: 0.5431 - val_acc: 0.7351
Epoch 3/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5458 - acc: 0.7312 - val_loss: 0.5414 - val_acc: 0.7347
Epoch 4/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5414 - acc: 0.7351 - val_loss: 0.5421 - val_acc: 0.7341
Epoch 5/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5393 - acc: 0.7358 - val_loss: 0.5328 - val_acc: 0.7411
Epoch 6/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5361 - acc: 0.7378 - val_loss: 0.5395 - val_acc: 0.7355
Epoch 7/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5336 - acc: 0.7393 - val_loss: 0.5511 - val_acc: 0.7259
Epoch 8/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5327 - acc: 0.7404 - val_loss: 0.5319 - val_acc: 0.7417
Epoch 9/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5313 - acc: 0.7411 - val_loss: 0.5283 - val_acc: 0.7466
Epoch 10/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5300 - acc: 0.7423 - val_loss: 0.5301 - val_acc: 0.7469
Epoch 11/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5289 - acc: 0.7431 - val_loss: 0.5323 - val_acc: 0.7391
Epoch 12/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5282 - acc: 0.7433 - val_loss: 0.5353 - val_acc: 0.7420
Epoch 13/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5273 - acc: 0.7442 - val_loss: 0.5278 - val_acc: 0.7451
Epoch 14/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5261 - acc: 0.7449 - val_loss: 0.5282 - val_acc: 0.7424
Epoch 15/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5255 - acc: 0.7452 - val_loss: 0.5283 - val_acc: 0.7447
Epoch 16/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5248 - acc: 0.7460 - val_loss: 0.5224 - val_acc: 0.7485
Epoch 17/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5240 - acc: 0.7463 - val_loss: 0.5294 - val_acc: 0.7475
Epoch 18/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5233 - acc: 0.7464 - val_loss: 0.5351 - val_acc: 0.7403
Epoch 19/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5227 - acc: 0.7473 - val_loss: 0.5269 - val_acc: 0.7449
Epoch 20/100
390870/390870 [==============================] - 30s 76us/step - loss: 0.5221 - acc: 0.7478 - val_loss: 0.5241 - val_acc: 0.7486
Epoch 21/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5221 - acc: 0.7480 - val_loss: 0.5232 - val_acc: 0.7488
Epoch 22/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5209 - acc: 0.7480 - val_loss: 0.5202 - val_acc: 0.7505
Epoch 23/100
390870/390870 [==============================] - 26s 68us/step - loss: 0.5214 - acc: 0.7482 - val_loss: 0.5321 - val_acc: 0.7448
Epoch 24/100
390870/390870 [==============================] - 27s 68us/step - loss: 0.5197 - acc: 0.7491 - val_loss: 0.5218 - val_acc: 0.7495
Epoch 25/100
390870/390870 [==============================] - 30s 77us/step - loss: 0.5194 - acc: 0.7486 - val_loss: 0.5312 - val_acc: 0.7426
Epoch 26/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5186 - acc: 0.7492 - val_loss: 0.5195 - val_acc: 0.7495
Epoch 27/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5187 - acc: 0.7494 - val_loss: 0.5298 - val_acc: 0.7440
Epoch 28/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5180 - acc: 0.7498 - val_loss: 0.5346 - val_acc: 0.7420
Epoch 29/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5180 - acc: 0.7494 - val_loss: 0.5221 - val_acc: 0.7502
Epoch 30/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5176 - acc: 0.7495 - val_loss: 0.5265 - val_acc: 0.7498
Epoch 31/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5173 - acc: 0.7504 - val_loss: 0.5224 - val_acc: 0.7479
Epoch 32/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5174 - acc: 0.7503 - val_loss: 0.5295 - val_acc: 0.7468
Epoch 33/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5172 - acc: 0.7500 - val_loss: 0.5182 - val_acc: 0.7502
Epoch 34/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5171 - acc: 0.7502 - val_loss: 0.5211 - val_acc: 0.7487
Epoch 35/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5165 - acc: 0.7501 - val_loss: 0.5218 - val_acc: 0.7477
Epoch 36/100
390870/390870 [==============================] - 27s 69us/step - loss: 0.5168 - acc: 0.7505 - val_loss: 0.5328 - val_acc: 0.7411
Epoch 37/100
390870/390870 [==============================] - 27s 69us/step - loss: 0.5158 - acc: 0.7509 - val_loss: 0.5214 - val_acc: 0.7514
Epoch 38/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5151 - acc: 0.7518 - val_loss: 0.5209 - val_acc: 0.7505
Epoch 39/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5148 - acc: 0.7513 - val_loss: 0.5233 - val_acc: 0.7490
Epoch 40/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5147 - acc: 0.7516 - val_loss: 0.5204 - val_acc: 0.7511
Epoch 41/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5145 - acc: 0.7518 - val_loss: 0.5220 - val_acc: 0.7513
Epoch 42/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5154 - acc: 0.7507 - val_loss: 0.5218 - val_acc: 0.7492
Epoch 43/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5195 - acc: 0.7483 - val_loss: 0.5212 - val_acc: 0.7476
Epoch 44/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5140 - acc: 0.7519 - val_loss: 0.5210 - val_acc: 0.7519
Epoch 45/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5137 - acc: 0.7521 - val_loss: 0.5206 - val_acc: 0.7504
Epoch 46/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5133 - acc: 0.7524 - val_loss: 0.5250 - val_acc: 0.7488
Epoch 47/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5138 - acc: 0.7519 - val_loss: 0.5187 - val_acc: 0.7510
Epoch 48/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5134 - acc: 0.7523 - val_loss: 0.5319 - val_acc: 0.7428
Epoch 49/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5132 - acc: 0.7525 - val_loss: 0.5208 - val_acc: 0.7507
Epoch 50/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5155 - acc: 0.7523 - val_loss: 0.5569 - val_acc: 0.7311
Epoch 51/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5120 - acc: 0.7532 - val_loss: 0.5189 - val_acc: 0.7510
Epoch 52/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5120 - acc: 0.7526 - val_loss: 0.5177 - val_acc: 0.7522
Epoch 53/100
390870/390870 [==============================] - 29s 75us/step - loss: 0.5143 - acc: 0.7511 - val_loss: 0.5204 - val_acc: 0.7510
Epoch 54/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5175 - acc: 0.7495 - val_loss: 0.5230 - val_acc: 0.7493
Epoch 55/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5123 - acc: 0.7526 - val_loss: 0.5230 - val_acc: 0.7488
Epoch 56/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5128 - acc: 0.7524 - val_loss: 0.5350 - val_acc: 0.7442
Epoch 57/100
390870/390870 [==============================] - 28s 72us/step - loss: 0.5118 - acc: 0.7529 - val_loss: 0.5317 - val_acc: 0.7469
Epoch 58/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5110 - acc: 0.7542 - val_loss: 0.5214 - val_acc: 0.7471
Epoch 59/100
390870/390870 [==============================] - 28s 71us/step - loss: 0.5110 - acc: 0.7539 - val_loss: 0.5240 - val_acc: 0.7492
Epoch 60/100
390870/390870 [==============================] - 27s 69us/step - loss: 0.5103 - acc: 0.7543 - val_loss: 0.5316 - val_acc: 0.7407
Epoch 61/100
390870/390870 [==============================] - 27s 69us/step - loss: 0.5104 - acc: 0.7538 - val_loss: 0.5214 - val_acc: 0.7499
Epoch 62/100
390870/390870 [==============================] - 27s 68us/step - loss: 0.5103 - acc: 0.7541 - val_loss: 0.5351 - val_acc: 0.7428
Epoch 63/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5100 - acc: 0.7542 - val_loss: 0.5160 - val_acc: 0.7526
Epoch 64/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5102 - acc: 0.7543 - val_loss: 0.5191 - val_acc: 0.7527
Epoch 65/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5098 - acc: 0.7543 - val_loss: 0.5195 - val_acc: 0.7500
Epoch 66/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5098 - acc: 0.7541 - val_loss: 0.5231 - val_acc: 0.7504
Epoch 67/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5095 - acc: 0.7546 - val_loss: 0.5178 - val_acc: 0.7528
Epoch 68/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5092 - acc: 0.7546 - val_loss: 0.5310 - val_acc: 0.7401
Epoch 69/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5087 - acc: 0.7551 - val_loss: 0.5192 - val_acc: 0.7500
Epoch 70/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5090 - acc: 0.7546 - val_loss: 0.5217 - val_acc: 0.7496
Epoch 71/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5088 - acc: 0.7550 - val_loss: 0.5210 - val_acc: 0.7502
Epoch 72/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5087 - acc: 0.7549 - val_loss: 0.5282 - val_acc: 0.7473
Epoch 73/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5090 - acc: 0.7548 - val_loss: 0.5350 - val_acc: 0.7392
Epoch 74/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5076 - acc: 0.7550 - val_loss: 0.5198 - val_acc: 0.7528
Epoch 75/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5092 - acc: 0.7553 - val_loss: 0.5231 - val_acc: 0.7496
Epoch 76/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5074 - acc: 0.7556 - val_loss: 0.5198 - val_acc: 0.7512
Epoch 77/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5079 - acc: 0.7553 - val_loss: 0.5212 - val_acc: 0.7504
Epoch 78/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5110 - acc: 0.7550 - val_loss: 0.5387 - val_acc: 0.7505
Epoch 79/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5082 - acc: 0.7559 - val_loss: 0.5262 - val_acc: 0.7460
Epoch 80/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5081 - acc: 0.7551 - val_loss: 0.5198 - val_acc: 0.7502
Epoch 81/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5176 - acc: 0.7486 - val_loss: 0.5226 - val_acc: 0.7471
Epoch 82/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5117 - acc: 0.7525 - val_loss: 0.5283 - val_acc: 0.7431
Epoch 83/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5130 - acc: 0.7523 - val_loss: 0.5443 - val_acc: 0.7500
Epoch 84/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5188 - acc: 0.7455 - val_loss: 0.5171 - val_acc: 0.7527
Epoch 85/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5097 - acc: 0.7541 - val_loss: 0.5418 - val_acc: 0.7437
Epoch 86/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5095 - acc: 0.7544 - val_loss: 0.5177 - val_acc: 0.7529
Epoch 87/100
390870/390870 [==============================] - 27s 70us/step - loss: 0.5082 - acc: 0.7550 - val_loss: 0.5190 - val_acc: 0.7515
Epoch 88/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5073 - acc: 0.7560 - val_loss: 0.5881 - val_acc: 0.7489
Epoch 89/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5073 - acc: 0.7563 - val_loss: 0.5170 - val_acc: 0.7505
Epoch 90/100
390870/390870 [==============================] - 26s 68us/step - loss: 0.5073 - acc: 0.7556 - val_loss: 0.5199 - val_acc: 0.7511
Epoch 91/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5070 - acc: 0.7558 - val_loss: 0.5236 - val_acc: 0.7519
Epoch 92/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5073 - acc: 0.7557 - val_loss: 0.5192 - val_acc: 0.7510
Epoch 93/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5066 - acc: 0.7564 - val_loss: 0.5225 - val_acc: 0.7478
Epoch 94/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5062 - acc: 0.7567 - val_loss: 0.5167 - val_acc: 0.7525
Epoch 95/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5058 - acc: 0.7565 - val_loss: 0.5262 - val_acc: 0.7433
Epoch 96/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5059 - acc: 0.7566 - val_loss: 0.5157 - val_acc: 0.7535
Epoch 97/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5055 - acc: 0.7569 - val_loss: 0.5194 - val_acc: 0.7516
Epoch 98/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5059 - acc: 0.7561 - val_loss: 0.5254 - val_acc: 0.7454
Epoch 99/100
390870/390870 [==============================] - 26s 66us/step - loss: 0.5057 - acc: 0.7567 - val_loss: 0.5354 - val_acc: 0.7492
Epoch 100/100
390870/390870 [==============================] - 26s 67us/step - loss: 0.5054 - acc: 0.7567 - val_loss: 0.5189 - val_acc: 0.7509
Out[72]:
<keras.callbacks.History at 0x1b77226d3c8>
In [73]:
# Save trained model
classifier.save("../models/trained_deep_neural_network_real.h5")
In [74]:
# Evaluate the trained model on the training set; returns [loss, accuracy]
evaluation = classifier.evaluate(train_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']), train_set['likes'])
print(evaluation)
558386/558386 [==============================] - 17s 30us/step
[0.5069156529383092, 0.7561507630928384]
In [75]:
del train_set
test_set = _pd.read_pickle('../dataset/m2_n9/model_test_set_3.pickle')
test_set = test_set.drop(columns=[*_cols_bin, *_cols_std])
test_set.shape
Out[75]:
(153993, 754)
In [76]:
prediction = classifier.predict(test_set.drop(columns = ['likes', 'stars_review', 'review_id', 'user_id', 'business_id']))
binary_prediction = _binarize(prediction, threshold = 0.5)
binary_prediction
Out[76]:
array([[1.],
       [0.],
       [1.],
       ...,
       [0.],
       [1.],
       [0.]], dtype=float32)
In [77]:
set(binary_prediction[:,0])
Out[77]:
{0.0, 1.0}
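The set check above confirms the thresholded output contains only 0s and 1s. The binarization itself is just an elementwise comparison; a minimal sketch with a hypothetical toy array of sigmoid outputs shows the equivalence with plain NumPy:

```python
import numpy as np

# toy sigmoid outputs standing in for classifier.predict(...)
scores = np.array([[0.1], [0.5], [0.7], [0.49], [0.95]])

# equivalent of sklearn's binarize(scores, threshold=0.5):
# values strictly greater than the threshold become 1, the rest 0
binary = (scores > 0.5).astype(np.float32)
print(binary.ravel())  # [0. 0. 1. 0. 1.]
```

Note that exactly 0.5 maps to 0, since `binarize` uses a strict inequality.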
In [78]:
print("Report for Deep Learning approach:")
print(_classification_report(test_set['likes'], binary_prediction))

print("Accuracy for Deep Learning approach:", _accuracy_score(test_set['likes'], binary_prediction) * 100)
Report for Deep Learning approach:
              precision    recall  f1-score   support

           0       0.70      0.39      0.51     50930
           1       0.75      0.92      0.83    103063

    accuracy                           0.74    153993
   macro avg       0.73      0.66      0.67    153993
weighted avg       0.74      0.74      0.72    153993

Accuracy for Deep Learning approach: 74.43974726123915
In [79]:
matrix = _confusion_matrix(test_set['likes'], binary_prediction)
print(matrix)
[[20079 30851]
 [ 8510 94553]]
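As a sanity check, the classification report above can be reproduced directly from the confusion-matrix entries (rows are true classes, columns are predicted classes, so the cells are TN, FP / FN, TP):

```python
# entries copied from the confusion matrix printed above
tn, fp, fn, tp = 20079, 30851, 8510, 94553

accuracy = (tn + tp) / (tn + fp + fn + tp)
precision_1 = tp / (tp + fp)  # of all predicted likes, how many were real
recall_1 = tp / (tp + fn)     # of all real likes, how many were found

print(round(accuracy, 4), round(precision_1, 2), round(recall_1, 2))
# → 0.7444 0.75 0.92
```

These match the accuracy (74.44%) and the precision/recall of class 1 in the report, confirming the two outputs are consistent.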
In [80]:
# Draw the ROC curve.
# Note: the curve must be built from the continuous sigmoid outputs in
# `prediction`, not from `binary_prediction` — thresholded labels would
# collapse the curve to a single operating point.
fpr, tpr, thresholds = _roc_curve(test_set['likes'], prediction)

_plt.plot(fpr, tpr)
_plt.xlim([0.0, 1.0])
_plt.ylim([0.0, 1.0])

_plt.title("Neural Network ROC curve")
_plt.xlabel("False Positive Rate")
_plt.ylabel("True Positive Rate")

_plt.grid(True)
_plt.show()
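The same `(fpr, tpr)` arrays can be summarized into a single AUC number via the trapezoidal rule, which is what `sklearn.metrics.auc` computes; a sketch with hypothetical toy ROC points:

```python
import numpy as np

# hypothetical ROC points (fpr must be sorted ascending)
fpr = np.array([0.0, 0.2, 0.6, 1.0])
tpr = np.array([0.0, 0.7, 0.9, 1.0])

# trapezoidal rule: area under the piecewise-linear curve
auc = np.trapz(tpr, fpr)
print(round(auc, 2))  # → 0.77
```

An AUC of 0.5 corresponds to the diagonal (random guessing), 1.0 to a perfect classifier.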
In [81]:
_del_all()